SQM Accenture M5 070908
[Module 5]
By PradeepKumar KS
Contents
• Defect Prevention Concepts
• Defect Type Classification
• Defect Prevention Approach
What Are Defects?
• A defect is:
– Any variance from a desired product attribute or specification
– Any undesirable feature of a work product
– Any variance from end-user/customer expectations
• A defect detracts from the software’s ability to completely
and effectively meet the customer’s needs
• A defect can be identified, described and counted
Origin of Defects
• Human errors
• Tools
• Methods
• Environment
• Defects are injected during all phases of software
development:
– Requirements 56%
– Design 27%
– Coding 7%
– Others 10%
Cost of Defect Removal
• Relative cost to fix a defect, by the phase in which it is found
(Requirements = 1):
– Requirements: 1
– Design: 2.5
– Coding: 6
– Testing: 25
– Installation: 75
Defect Prevention
• Defect Prevention is the process of identifying
the root causes of defects and creating
action plans that prevent the recurrence of
such defects.
Defect Classification
• By Severity
– Fatal
– Major
– Minor
– Cosmetic
• By Origin
– Requirements
– Design
– Implementation
– Test phase errors
– Inspection and checkout errors
– Maintenance errors
Defect Classification (Contd.)
• By Priority
– Needs immediate resolution
– High priority
– Low priority
• Defect Type
– GUI
– Logic
– Exception handling
– Interface
• Orthogonal Defect Classification
Orthogonal Defect Classification (ODC)
• ODC is a concept that enables in-process feedback to
developers by extracting signatures from defects that
reveal cause-effect relationships in the development
process.
• The classification is to be based on what is known
about the defect, such as its defect type or trigger, and
not on opinion, such as where it was injected.
• Defect type identifies what is corrected and can be
associated with the different stages of the process.
Thus, a set of defects from different stages in the
process, classified according to an orthogonal set of
attributes, should bear the signature of this stage in its
distribution.
ODC (Contd.)
• The defect trigger attribute is designed to provide a
measure of the effectiveness of a verification stage.
Defect triggers capture the circumstance that allowed
the defect to surface. Taken together, the cross-product
of defect type and trigger provides information that can
estimate the effectiveness of the process.
• Orthogonal Defect Classification (ODC) essentially
means that we categorize a defect into classes that
collectively point to the part of the process which needs
attention, much like characterizing a point in a
Cartesian system of orthogonal axes by its (x, y, z)
coordinates.
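As a minimal sketch of this idea (with hypothetical defect records, not data from the module), the defect-type × trigger cross-product can be tallied so that its distribution exposes the "signature" pointing at a process stage:

```python
from collections import Counter

# Hypothetical defect records as (defect_type, trigger) pairs -- the two
# orthogonal ODC attributes discussed above. Data is illustrative only.
defects = [
    ("Assignment", "Unit Testing"),
    ("Assignment", "Code Review"),
    ("Interface", "System Testing"),
    ("Checking", "Unit Testing"),
    ("Checking", "Unit Testing"),
]

# Cross-product of defect type and trigger: its distribution is the
# signature that, per ODC, points to the process stage needing attention.
signature = Counter(defects)
by_type = Counter(dtype for dtype, _ in defects)
```

Each defect contributes one point to the distribution, much like the (x, y, z) coordinates of the Cartesian analogy above.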
Defect Type Classification
• Function (Class/Object)
• Assignment (Initialization)
• Interface
• Checking
• Timing/ serialization
• Build/ package/ merge
• Documentation
• Algorithm (Method)
Function / Class / Object
• A function error is one that
– Affects significant capability, end-user interfaces, product
interfaces, interfaces with hardware architecture, or global data
structure
– Requires a formal design change
• Examples
– Faults in database design – e.g. A database did not include a
field for street address, although the requirements specified it.
– Incomplete/incorrect model (prototype, use case, class diagram)
– e.g. A class was omitted during system design
– Steps in improper sequence (activity diagram) – e.g. browser
back button not taking you to the right back screen, screen
refresh not occurring, etc.
Assignment / Initialization
• An assignment error involves:
– A few lines of code, such as the initialization of control blocks or
data structures.
• Examples
– Missing initialization of variables, parameters, control blocks,
state variables, objects, etc.
– Wrong initialization of variables, parameters, control blocks, state
variables, objects, etc.
– Incorrect setting/resetting of variables, control block, state
variables, objects, flags, etc.
– Improper acquiring and releasing resources like memory, DB
connection, etc.
– Default enable/disable of GUI elements not implemented right
– Default screens not coming up right
– Default/alert focus not at the right field
– Improper combo options, default combo value, etc.
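For instance, a missing-initialization defect of this type might look like the following sketch (hypothetical code, not from the module):

```python
# Assignment/initialization defect: 'total' is used before it is assigned,
# so the defective version raises UnboundLocalError on the first iteration.
def total_buggy(counts):
    for c in counts:
        total += c  # defect: 'total' was never initialized
    return total

# The fix is a single assignment before use -- the "few lines of code"
# footprint typical of an Assignment-type defect.
def total_fixed(counts):
    total = 0
    for c in counts:
        total += c
    return total
```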
Interface
• Interface error corresponds to errors in interacting with
other components, modules or device drivers via macros,
call statements, control blocks, or parameter lists.
• Examples
– Incorrect function invocation
– Incorrect macro usage
– Incorrect object interfacing – e.g. the number and types of
parameters of the OO-message do not conform to the signature
of the requested service
– Incorrect service invoked
– Incorrect sharing of resources – like session variables, files,
memory, etc.
– Browser incompatibility issues
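A minimal illustration of such a signature mismatch (function and parameter names are hypothetical):

```python
# Interface defect: the caller's arguments do not conform to the signature
# of the requested service. Names here are illustrative only.
def send_notification(user_id: int, message: str) -> str:
    return f"to={user_id}: {message}"

# Defective call (arguments swapped, so types don't match the signature):
#   send_notification("maintenance window", 42)
# Fixed call, conforming to the declared parameter order:
result = send_notification(42, "maintenance window")
```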
Checking
• Checking addresses program logic that has failed to
properly validate data and values before they are used.
• Examples
– Missing validation of parameters or data, like URL validation,
Email address validation, alpha/numeric/special character
validation, etc.
– Incorrect validation of parameters or data – e.g. conditional loop
should have stopped on the ninth iteration, but goes to tenth
– Missing exception handling
– Incorrect exception handling
– Missing GUI validation – leading to truncated display, etc.
– Incorrect/no check for duplicate records/data – e.g. not handling
multiple entries of the same Email ID, multiple uploads of the
same file, etc.
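The missing-validation and duplicate-check examples above can be sketched as follows (the regex and helper names are assumptions for illustration, not the actual project checks):

```python
import re

# Simple e-mail shape check -- illustrative, not a full RFC 5322 validator.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_email(address: str) -> bool:
    return bool(EMAIL_RE.match(address))

def register(seen: set, address: str) -> bool:
    """Checking-type fixes: validate the input AND reject duplicates."""
    if not validate_email(address):
        return False          # missing-validation defect, now handled
    if address in seen:
        return False          # missing duplicate-check defect, now handled
    seen.add(address)
    return True
```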
Timing/serialization
• Examples
– Incorrect locking mechanisms
– Serialization is missing when making updates to a shared
control block.
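A sketch of the missing-serialization defect (hypothetical shared counter; the lock is the fix):

```python
import threading

# Shared state updated by several threads. Without the lock, the
# read-modify-write in 'counter += 1' is a data race and increments
# can be lost -- a Timing/serialization defect.
counter = 0
lock = threading.Lock()

def increment(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:            # serializes updates to the shared counter
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

With the lock in place the final count is deterministic; removing the `with lock:` line reintroduces the defect.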
Build/package/merge
• Build/package/merge errors arise from mistakes in library
systems, change management or version control.
• Examples
– Missing components in a build
– Wrong version of a file packaged/delivered
Documentation
• Documentation errors can affect all deliverables of a
project.
• Examples
– Incorrect comment
– Grammar, semantic and typo errors
– Non-adherence to documentation standards
– Non-adherence to naming standards
– GUI formatting error – e.g. Help section missing for a tab in the
screen
– Ambiguous/Incorrect error messages
– Incorrect formatting of reports
Algorithm/Method
• Algorithm errors:
– Include efficiency or correctness problems that affect the task
– Can be fixed by (re)implementing an algorithm or local data
structure without requesting a design change.
• Examples
– Incorrect implementation – wrong option chosen when alternate
implementation methods are possible – e.g. Linear linked list
instead of circular linked list
– Logic errors
– Un-optimized design/code
– Wrong SQL queries/updates
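As a small illustration (hypothetical routine), an Algorithm-type defect is fixed locally, with no change to the interface or design:

```python
# The defective version divided by a fixed window size even when fewer
# samples existed; the local fix divides by the actual sample count.
# No interface or design change is needed -- the ODC criterion for an
# Algorithm-type defect.
def average(samples):
    if not samples:
        return 0.0
    return sum(samples) / len(samples)
```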
Defect Type - Process Association
• Function
– Found: FS/Prototype Review; High Level Design Review;
System Testing; Acceptance Testing
– Introduced: Functional Specification; Prototype;
High Level Design
• Assignment
– Found: Code Review; Unit Testing; Integration/System Testing;
Acceptance Testing
– Introduced: Coding
• Interface
– Found: Low Level Design Review; Code Review; Unit Testing;
Integration/System Testing; Acceptance Testing
– Introduced: Functional Specification; High Level Design;
Low Level Design; Coding
Defect Type - Process Association (Contd.)
• Checking
– Found: Low Level Design Review; Code Review; Unit Testing;
Integration/System Testing; Acceptance Testing
– Introduced: Low Level Design; Coding
• Timing/serialization
– Found: Low Level Design Review; Code Review; Unit Testing;
Integration/System Testing; Acceptance Testing
– Introduced: Low Level Design
• Build/package/merge
– Found: Integration/System Testing; Acceptance Testing
– Introduced: Configuration Control (including packaging &
delivery)
Defect Type - Process Association (Contd.)
• Documentation
– Found: All Phases
– Introduced: All Phases
Approach to Defect Prevention
• Monitoring/Analyzing Defect Data
• Defect Classification (based on defect attributes)
• Defined set of Root Causes
• Defect prevention activities at two levels:
– Organization
– Project
• Plan for Defect Prevention
• Prepare/Implement Preventive Actions
• Tracking Status of Defect Prevention activities
• Review of Defect Prevention activities
Defect Prevention Planning
• Defect Prevention Council plans defect
prevention activities at organization level
– Inputs: Current trends of defects; project defect data
– Output: Organization Defect Prevention Plan
• Project Manager plans for project-specific defect
prevention activities
– Inputs: Organizational Defect prevention plan, defect
data/defect prevention plans from similar projects
– Output: Defect Prevention Plan section in Software
Project Plan
Monitoring Defect Data
• SQA responsible for monitoring defect data at
organization level
• Monthly Root Cause Analysis based on:
– Defect data for the month (from IRRs and DTS)
– Projects’ Root Cause Analysis Reports
• Organization Root Cause Analysis Report
published (by the SQA Team)
Monitoring Defect Data (Contd.)
• PM & PQA responsible for monitoring defect
data at project level
• Monthly/Milestone Root Cause Analysis based
on:
– Defect data for the month or period/phase (from IRRs
and DTS)
• Prepare Project Root Cause Analysis Report
• Project Root Cause Analysis Report and
Organizational Root Cause Analysis Report
used for re-planning defect prevention activities
in the project
Track Status of DP Activities
• Project Level
– Phase end metrics
– Monthly metrics
• Organization Level
– Baseline reports
Review of Defect Prevention Activities
• Project Review meeting
• Weekly Review meeting
• Monthly Operations Review meeting
• Quarterly QMS Review meeting
Defect Database
• IRRs and DTS constitute the Defect Database
• Attributes of a defect
– Defect ID/No.
– Defect Description
– Defect Severity (1 = Fatal; 2 = Major; 3 = Minor; 4 = Cosmetic)
– Defect Priority (1 = High; 2 = Medium; 3 = Low)
– Phase of Detection
– Phase of Origin
– Defect Type (ODC list)
– How fixed
– Root Cause (Root cause list)
– Remarks (for additional information)
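A minimal sketch of a defect record carrying these attributes (the class is illustrative, not the actual IRR/DTS schema):

```python
from dataclasses import dataclass

# Enumerations as listed on the slide.
SEVERITY = {1: "Fatal", 2: "Major", 3: "Minor", 4: "Cosmetic"}
PRIORITY = {1: "High", 2: "Medium", 3: "Low"}

@dataclass
class Defect:
    defect_id: str
    description: str
    severity: int          # 1=Fatal .. 4=Cosmetic
    priority: int          # 1=High .. 3=Low
    phase_detected: str
    phase_of_origin: str
    defect_type: str       # from the ODC defect type list
    how_fixed: str = ""
    root_cause: str = ""   # from the root cause list
    remarks: str = ""      # additional information

# Hypothetical record for illustration.
d = Defect("D-001", "Null check missing on save", 2, 1,
           "Unit Testing", "Coding", "Checking")
```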
Root Cause Analysis – Project Level
• Inputs to Root Cause Analysis: IRRs and DTS
• For each defect in IRRs and DTS:
– Associate a defect type (based on ODC)
– Associate the root cause – why the defect was introduced
– Identify the phase where it was introduced
• For the period of interest – a milestone:
– Phase end and other milestones for Development & Migration projects
– Monthly for Maintenance projects
• Group defects by Phase of Origin and Defect Type and
arrive at the following Table
Phase of Origin | Defect Type | #Defects
-- | -- | --
-- | -- | --
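The grouping step above can be sketched as follows (the defect records are hypothetical):

```python
from collections import Counter

# (phase of origin, defect type) for each defect in the period of interest.
records = [
    ("Coding", "Assignment"),
    ("Coding", "Checking"),
    ("Coding", "Checking"),
    ("High Level Design", "Function"),
]

# Group by Phase of Origin and Defect Type to build the table above.
table = Counter(records)
rows = [(phase, dtype, n) for (phase, dtype), n in sorted(table.items())]
```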
Root Cause Analysis (Contd.)
• For each “Phase of Origin” perform Pareto Analysis
• Identify the Defect Types that contribute to 80% of the
defects (“Top-80-Defect-Types”)
• For each of the “Top-80-Defect-Types”, group defects by
Root Cause and arrive at the following Table
Defect Type | Root Cause | #Defects
-- | -- | --
-- | -- | --
• For each “Defect Type”, perform Pareto Analysis
• Identify the Root Causes that contribute to 80% of the
defects (“Top-80-Root-Causes”)
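A sketch of the Pareto step, assuming simple count data: sort the categories by count and keep the smallest prefix whose cumulative share reaches 80%:

```python
def top_80(counts: dict) -> list:
    """Return the categories that together contribute >= 80% of defects."""
    total = sum(counts.values())
    picked, running = [], 0
    for category, n in sorted(counts.items(), key=lambda kv: -kv[1]):
        picked.append(category)
        running += n
        if running / total >= 0.8:
            break
    return picked

# Hypothetical defect-type counts for one phase of origin.
counts = {"Checking": 40, "Assignment": 30, "Interface": 20, "Documentation": 10}
```

The same routine applies unchanged to the second pass, where the categories are root causes within a defect type ("Top-80-Root-Causes").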
Root Cause Analysis (Contd.)
• For each of the “Top-80-Root-Causes”, identify
preventive actions:
– Brainstorm
– Prepare Cause & Effect Diagram
– Prepare Preventive Action Plan
Root Causes
• Bad fix
• External cause
• Inadequate configuration management
• Inadequate domain knowledge
• Inadequate process
• Inadequate software/language/tools skills
• Inadequate self review
• Incomplete/Unclear inputs
• Process non-compliance
• Traceability/Transcription errors
Root Causes (Contd.)
• Bad fix
– A fix to an existing problem that leads to further defects due to
insufficient impact analysis or wrong implementation. Usually
encountered in maintenance projects or during the testing phase.
• External cause
– An external system misbehaves, giving the impression that our
application has a defect. The fix would have been to set right
the other system.
• Inadequate configuration management
– Packaging problems, missing components, files in wrong
directories, wrong versions of files used, etc.
• Inadequate domain knowledge
– A better picture of the business domain would have ensured that
these types of defects were not introduced.
Root Causes (Contd.)
• Inadequate process
– Absent or inadequate standards/checklists/procedures/
guidelines at both the organization and project level.
• Inadequate software/language/tools skills
– Lack of expertise in OS, App server, Web server,
RDBMS, Testing/Modeling/Documenting Tools,
Analysis/Design/Coding skills, Language skills, etc.
• Inadequate self review
– Absence of any of the other root causes. The problem is solely
due to human error, oversight, copy-paste errors, inadequate
self review, etc.
Root Causes (Contd.)
• Incomplete/Unclear inputs
– Incomplete inputs or lack of clarity in inputs to the particular
activity – typically issues with the FS/Prototype while doing design,
absence of clarity in test cases while testing, etc.
• Process non-compliance
– Clearly defined standards/checklists/procedures/guidelines at
both organization and project level, but the same was not
adhered to while carrying out activities.
• Traceability/Transcription errors
– The input was understood incorrectly, so the transcription or the
implementation of the designed functionality is incorrect. Lack of
traceability with respect to previous work items.
Using ODC for Defect Prevention
• Defect type - Process association
• Estimation of defects
• Defect Prevention activities
References
• Ram Chillarege, Inderpal S. Bhandari, Jarir K. Chaar, Michael J.
Halliday, Diane S. Moebus, Bonnie K. Ray, and Man-Yuen Wong,
“Orthogonal Defect Classification – A Concept for In-Process
Measurements,” IEEE Transactions on Software Engineering,
Vol. 18, No. 11, November 1992.
END of MODULE 5