5 Defect Management
I. Introduction
i. Software defects are expensive.
ii. The cost of finding and correcting defects represents one of the most
expensive software development activities.
iii. While defects may be inevitable, we can minimize their number and impact
on our projects.
iv. To do this, development teams need to implement a defect management
process that focuses on preventing defects, catching defects as early in the
process as possible, and minimizing their impact.
v. A little investment in this process can yield significant returns.
1. Defect Classification
(Question: Explain the defect classification. – 8 Marks)
i. A defect is an error in coding or logic that causes a program to
malfunction or to produce incorrect or unexpected results.
ii. Defects can be classified severity wise and status wise.
Severity Wise:
i. Major: A defect, which will cause an observable product failure or departure
from requirements.
ii. Minor: A defect that will not cause a failure in execution of the product.
iii. Fatal: A defect that will cause the system to crash or close abruptly, or
affect other applications.
Status Wise:
i. Open
ii. Closed
iii. Deferred
iv. Cancelled
Defect Resolution: Work by the development team to prioritize, schedule and
fix a defect, and document the resolution. This also includes notification back
to the tester to ensure that the resolution is verified.
Figure: Defect life cycle, showing the statuses NEW, ASSIGNED/OPEN, DEFERRED, DROPPED/REJECTED, REASSIGNED/REOPENED and CLOSED/VERIFIED.
Table 1: Defect Statuses and Their Explanations
i. NEW: Tester finds a defect and posts it with the status NEW. This defect is yet
to be studied/approved. The fate of a NEW defect is one of ASSIGNED,
DROPPED or DEFERRED.
ii. ASSIGNED / OPEN: Test / Development / Project lead studies the NEW defect
and if it is found to be valid it is assigned to a member of the Development
Team. The assigned Developer’s responsibility is now to fix the defect and have
it COMPLETED. Sometimes, ASSIGNED and OPEN can be different statuses. In
that case, a defect can be open yet unassigned.
iii. DEFERRED: If it is decided that a valid NEW or ASSIGNED defect will be fixed
in an upcoming release instead of the current one, it is DEFERRED. The defect is
ASSIGNED again when that release is taken up.
iv. DROPPED / REJECTED: Test / Development/ Project lead studies the NEW
defect and if it is found to be invalid, it is DROPPED / REJECTED. Note that the
specific reason for this action needs to be given.
v. COMPLETED / FIXED / RESOLVED / TEST: Developer ‘fixes’ the defect that is
ASSIGNED to him or her. Now, the ‘fixed’ defect needs to be verified by the
Test Team and the Development Team ‘assigns’ the defect back to the Test
Team. A COMPLETED defect is either CLOSED, if fine, or REASSIGNED, if still
not fine.
vi. If a Developer cannot fix a defect, some organizations may offer the following
statuses:
Won’t Fix / Can’t Fix: The Developer will not or cannot fix the defect due
to some reason.
Can’t Reproduce: The Developer is unable to reproduce the defect.
Need More Information: The Developer needs more information on the
defect from the Tester.
vii. REASSIGNED / REOPENED: If the Tester finds that the ‘fixed’ defect is in fact
not fixed or only partially fixed, it is reassigned to the Developer who ‘fixed’ it.
A REASSIGNED defect needs to be COMPLETED again.
viii. CLOSED / VERIFIED: If the Tester / Test Lead finds that the defect is indeed
fixed and is no longer of any concern, it is CLOSED / VERIFIED. This is the happy
ending. A small code sketch of these status transitions is given below.
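The life cycle described above can be summarized as a set of allowed transitions. Below is a minimal sketch in Python; the status names follow the explanations above, but the dictionary representation and the helper function are only one possible way to express it, not something prescribed by the text.

# Allowed defect status transitions, as described in the explanations above.
ALLOWED_TRANSITIONS = {
    "NEW": {"ASSIGNED", "DROPPED", "DEFERRED"},
    "ASSIGNED": {"COMPLETED", "DEFERRED"},
    "DEFERRED": {"ASSIGNED"},
    "COMPLETED": {"CLOSED", "REASSIGNED"},
    "REASSIGNED": {"COMPLETED"},
    "DROPPED": set(),   # terminal: the defect was judged invalid
    "CLOSED": set(),    # terminal: the fix has been verified
}

def can_move(current, new):
    # Return True if a defect may legally move from `current` to `new`.
    return new in ALLOWED_TRANSITIONS.get(current, set())

print(can_move("NEW", "ASSIGNED"))     # True
print(can_move("CLOSED", "ASSIGNED"))  # False: a closed defect stays closed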
2. Defect Template
(Question: Create the bug template for a login form. – 4 Marks)
i. Reporting a bug/defect properly is as important as finding a defect.
ii. If the defect found is not logged/reported correctly and clearly in a bug
tracking tool (like Bugzilla, ClearQuest, etc.), it will not be addressed properly
by the developers. It is therefore very important to fill in as much information
as possible in the defect template, so that the actual issue with the software is
easy to understand.
1. Sample defect template (a filled-in example for a hypothetical login-form defect follows the field list)
Abstract :
Platform :
Testcase Name :
Release :
Build Level :
Client Machine IP/Hostname :
Client OS :
Server Machine IP/Hostname :
Server OS :
Defect Type :
Priority :
Severity :
Developer Contacted :
Test Contact Person :
Attachments :
Any Workaround :
Steps to Reproduce
1.
2.
3.
Expected Result:
Actual Result:
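Tied to the question above, here is a hedged illustration of how the template might be filled in for a hypothetical login-form defect. Every value below (test case name, build numbers, results) is invented for the example; it is a sketch, not a record of a real defect.

# A hypothetical, filled-in defect record for a login-form bug,
# following the fields of the sample template above.
defect_report = {
    "Abstract": "Login form accepts a blank password and logs the user in",
    "Platform": "Web",
    "Testcase Name": "TC_Login_02_blank_password",
    "Release": "R2.1",
    "Build Level": "2.1.0.7",
    "Defect Type": "Functional",
    "Priority": "High",
    "Severity": "Major",
    "Steps to Reproduce": [
        "1. Open the login page",
        "2. Enter a valid user name and leave the password field blank",
        "3. Click the Login button",
    ],
    "Expected Result": "An error message 'Password is required' is shown",
    "Actual Result": "The user is logged in and taken to the home page",
}

for field, value in defect_report.items():
    print(f"{field}: {value}")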
Explanations of selected fields:
Module: Specific module of the product where the defect was detected.
Detected Build Version: Build version of the product where the defect was detected (e.g. 1.2.3.5).
Actual Result: The actual result you received when you followed the steps.
Fixed Build Version: Build version of the product where the defect was fixed (e.g. 1.2.3.9).
a) Estimating Defects
i. Intuitively, the maximum number of potential defects is equal to the
number of acceptance test cases, which is estimated as 1.2 x Function Points,
as illustrated below.
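As a rough illustration of this rule of thumb, here is a minimal Python sketch; the function name and the sample size of 250 Function Points are assumptions made for the example, not figures from the text.

def estimate_max_defects(function_points):
    # Rule of thumb from the text: maximum potential defects
    # ~= number of acceptance test cases ~= 1.2 x Function Points.
    return 1.2 * function_points

print(estimate_max_defects(250))  # 300.0 potential defects / acceptance test cases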
e) Defect Prevention
a) Quick Attacks:
i. Strengths
The quick-attacks technique allows you to perform a cursory analysis
of a system in a very compressed timeframe.
Even without a specification, you know a little bit about the software,
so the time spent is also time invested in developing expertise.
The skill is relatively easy to learn, and once you've attained some
mastery your quick-attack session will probably produce a few bugs.
Finally, quick attacks are quick.
They can help you to make a rapid assessment. You may not know the
requirements, but if your attacks yielded a lot of bugs, the
programmers probably aren't thinking about exceptional conditions,
and it's also likely that they made mistakes in the main functionality.
If your attacks don't yield any defects, you may have some confidence
in the general, happy-path functionality.
ii. Weaknesses
Quick attacks are often criticized for finding "bugs that don't matter"—
especially for internal applications.
While easy mastery of this skill is a strength, it creates the risk that
quick attacks are "all there is" to testing; thus, anyone who takes a two-
day course can do the work.
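To make the technique concrete, a quick attack typically throws hostile or extreme input at whatever field is in front of you. Below is a small illustrative sketch in Python; the attack list and the stand-in validation function are assumptions for the example, not part of the original text.

# A minimal sketch of quick-attack inputs for a hypothetical user-name field.
quick_attack_inputs = [
    "",                                 # empty input
    "A" * 100000,                       # extremely long string
    "   ",                              # whitespace only
    "0", "-1", "999999999999999999",    # numeric edge values in a text field
    "'; DROP TABLE users;--",           # SQL-injection-style string
    "<script>alert(1)</script>",        # script-injection-style string
    "ünïçödé",                          # non-ASCII characters
]

def validate_username(value):
    # Stand-in for the code under test: accepts 3-20 alphanumeric characters.
    return value.isalnum() and 3 <= len(value) <= 20

for attack in quick_attack_inputs:
    # A quick attack mainly checks that nothing crashes and that bad input is rejected gracefully.
    result = "accepted" if validate_username(attack) else "rejected"
    print(repr(attack[:30]), result)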
b) Equivalence Classes and Boundary Values
i. Strengths
Boundaries and equivalence classes give us a technique to reduce an
infinite test set into something manageable.
They also provide a mechanism for us to show that the requirements
are "covered".
ii. Weaknesses
The equivalence "classes" chosen are correct only in the mind of the
person who chose them.
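As a concrete illustration, here is a minimal Python sketch of boundary-value and equivalence-class test inputs. The field being tested (an age field accepting 18 to 60 inclusive) is an assumption invented for the example.

# Boundary values and equivalence-class representatives for a hypothetical
# age field that should accept integers from 18 to 60 inclusive.
def is_valid_age(age):
    # Stand-in for the validation logic under test.
    return 18 <= age <= 60

valid_class = [18, 35, 60]    # lower boundary, interior value, upper boundary
invalid_low = [17]            # just below the lower boundary
invalid_high = [61]           # just above the upper boundary

for age in valid_class + invalid_low + invalid_high:
    print(age, "valid" if is_valid_age(age) else "invalid")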
d) State-Transition Diagrams
i. Strengths
Mapping out the application provides a list of immediate, powerful test
ideas.
The model can be improved by collaborating with the whole team to find
"hidden" states—transitions that might be known only by the original
programmer or specification author.
Once you have the map, you can have other people draw their own
diagrams, and then compare theirs to yours.
The differences in those maps can indicate gaps in the requirements,
defects in the software, or at least different expectations among team
members.
ii. Weaknesses
The map you draw doesn't actually reflect how the software will
operate; in other words, "the map is not the territory."
Drawing a diagram won't by itself find the differences between the map
and the software's actual behaviour, and it might even give the team the
illusion of certainty.
Like just about every other technique on this list, a state-transition
diagram can be helpful, but it's not sufficient by itself to test an entire
application.
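As an illustration of turning such a map into test ideas, here is a minimal Python sketch. The states and events model a hypothetical login screen; they are assumptions for the example, not taken from the text.

# A tiny state-transition model of a hypothetical login screen.
# Each (state, event) pair is an immediate test idea.
transitions = {
    ("LoggedOut", "submit valid credentials"): "LoggedIn",
    ("LoggedOut", "submit invalid credentials"): "LoggedOut",
    ("LoggedIn", "log out"): "LoggedOut",
    ("LoggedIn", "session timeout"): "LoggedOut",
}

for (state, event), expected in transitions.items():
    # Put the application into `state`, trigger `event`, check the resulting state.
    print(f"Test idea: from {state}, {event}, expect {expected}")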
e) Use Cases and Scenarios
Use cases and scenarios focus on software in its role of enabling a human being
to do something.
i. Strengths
Use cases and scenarios tend to resonate with business customers, and
if done as part of the requirement process, they sort of magically
generate test cases from the requirements.
ii. Weaknesses
Soap opera tests (deliberately exaggerated, complex scenarios) have the
opposite problem; they're so complex that if something goes wrong, it may
take a fair bit of troubleshooting to find exactly where the error came from!
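To show what a scenario-style test can look like in code, here is a minimal Python sketch. The tiny stand-in shop object and its methods are invented so the example runs on its own; in practice the steps would drive the real product through its UI or API.

# A use-case-style test: the steps mirror what a human would do.
class FakeShop:
    # Minimal stand-in for the application under test.
    def __init__(self):
        self.cart = []
        self.logged_in = False
        self.confirmed = False
    def login(self, user, password):
        self.logged_in = True
    def add_to_cart(self, item):
        self.cart.append(item)
    def checkout(self):
        self.confirmed = self.logged_in and bool(self.cart)

def test_customer_buys_one_item():
    shop = FakeShop()
    shop.login("customer", "secret")    # step 1: the customer signs in
    shop.add_to_cart("wireless mouse")  # step 2: picks an item
    shop.checkout()                     # step 3: pays
    assert shop.confirmed               # expected outcome: order is confirmed

test_customer_buys_one_item()
print("scenario passed")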
f) Code Coverage
Imagine that you have a black-box recorder that writes down every single line of
code as it executes.
i. Strengths
Programmers love code coverage. It allows them to attach a number—
an actual, hard, real number, such as 75%—to the performance of their
unit tests, and they can challenge themselves to improve the score.
Meanwhile, looking at the code that isn't covered also can yield
opportunities for improvement and bugs!
ii. Weaknesses
Customer-level coverage tools are expensive; programmer-level tools
tend to assume the team is doing automated unit testing and has
a continuous-integration server and a fair bit of discipline.
After installing the tool, most people tend to focus on statement
coverage—the least powerful of the measures.
Even decision coverage doesn't deal with situations where the decision
contains defects, or when there are other, hidden equivalence classes;
say, in the third-party library that isn't measured in the same way as
your compiled source code is.
Having code-coverage numbers can be helpful, but using them as a
form of process control can actually encourage wrong behaviours. In
my experience, it's often best to leave these measures to the
programmers, to measure optionally for personal improvement (and
to find dead spots), not as a proxy for actual quality.
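The difference between statement and decision coverage can be seen with a tiny example. The function and checks below are assumptions invented for illustration; they are a sketch, not part of the original text.

# Why statement coverage is the weakest measure.
def discount(total, is_member):
    rate = 0.0
    if is_member and total > 100:
        rate = 0.1
    return total * (1 - rate)

# This single check executes every statement (100% statement coverage) ...
assert discount(200, True) == 180.0
# ... yet it never takes the False branch of the decision, so a defect in the
# non-member path would go unnoticed. Decision coverage also needs, e.g.:
assert discount(200, False) == 200.0
print("both branches exercised")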
g) Regression Testing
People spend a lot of money on regression testing, taking the old test
ideas described above and rerunning them over and over.
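The core of regression testing is re-running the same checks against every new build. The tiny sketch below illustrates the idea in Python; the function under test and the saved expected results are assumptions made up for the example.

# Re-verify previously correct results against the current build.
def add_tax(amount):
    # Stand-in for a function in the current build of the product.
    return round(amount * 1.18, 2)

saved_expectations = [   # results that earlier builds already produced correctly
    (100, 118.0),
    (0, 0.0),
    (19.99, 23.59),
]

for amount, expected in saved_expectations:
    actual = add_tax(amount)
    status = "PASS" if actual == expected else "FAIL (regression)"
    print(f"add_tax({amount}) = {actual}, expected {expected}: {status}")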
3. Reporting Defects Effectively
It is essential that you report defects effectively so that time and effort are not
unnecessarily wasted in trying to understand and reproduce the defect. Here are
some guidelines:
i. Be specific:
Specify the exact action: Do not say something like ‘Select Button B’.
Do you mean ‘Click Button B’, ‘Press ALT+B’, or ‘Focus on Button B and
press ENTER’?
In case of multiple paths, mention the exact path you followed: Do not say
something like “If you do ‘A and X’ or ‘B and Y’ or ‘C and Z’, you get D.”
Understanding all the paths at once will be difficult. Instead, say “Do ‘A and
X’ and you get D.” You can, of course, mention elsewhere in the report that
“D can also be got if you do ‘B and Y’ or ‘C and Z’.”
Do not use vague pronouns: Do not say something like “In Application A,
open X, Y, and Z, and then close it.” What does the ‘it’ stand for? ‘Z’, ‘Y’,
‘X’, or ‘Application A’?
ii. Be detailed:
Provide more information (not less). In other words, do not be lazy.
Developers may or may not use all the information you provide but they
sure do not want to beg you for any information you have missed.
iii. Be objective:
Do not make subjective statements like “This is a lousy application” or “You
fixed it real bad.”
Stick to the facts and avoid the emotions.
iv. Reproduce the defect:
Do not be impatient and file a defect report as soon as you uncover a
defect. Replicate it at least once more to be sure.
v. Review the report:
Do not hit ‘Submit’ as soon as you write the report.
Review it at least once.
Remove any typing errors.