CH 15
Product Metrics
Moonzoo Kim
CS Division of EECS Dept.
KAIST
moonzoo@cs.kaist.ac.kr
http://pswlab.kaist.ac.kr/courses/cs550-07
CS550 Intro. to SE
Spring 2007 1
Overview of Ch15. Product Metrics
15.1 Software Quality
15.2 A Framework for Product Metrics
15.3 Metrics for the Analysis Model
Function point metrics
15.4 Metrics for the Design Model
Architectural design metrics
Metrics for OO design
Class-oriented metrics
Component-level design metrics
Operation-oriented metrics
15.5 Metrics for Source Code
15.6 Metrics for Testing
15.7 Metrics for Maintenance
McCall’s Triangle of Quality (1970s)
PRODUCT OPERATION: Correctness, Reliability, Usability, Integrity, Efficiency
PRODUCT REVISION: Maintainability, Flexibility, Testability
PRODUCT TRANSITION: Portability, Reusability, Interoperability
SW built to conform to these factors will exhibit high quality,
even if there are dramatic changes in technology.
Measures, Metrics and Indicators
Measure: a quantitative indication of the extent, amount, dimension,
capacity, or size of some attribute of a product or process
Metric: a quantitative measure of the degree to which a system,
component, or process possesses a given attribute (IEEE Std. 610.12)
Indicator: a metric or combination of metrics that provides insight
into the software process, the project, or the product itself
Measurement Principles
The objectives of measurement should be
established before data collection begins
Each technical metric should be defined in an
unambiguous manner
Metrics should be derived based on a theory that is
valid for the domain of application
Metrics for design should draw upon basic design concepts
and principles and attempt to provide an indication of the
presence of a desirable attribute
Metrics should be tailored to best accommodate specific
products and processes
Measurement Process
Formulation.
The derivation of software measures and metrics appropriate for the
representation of the software that is being considered.
Collection.
The mechanism used to accumulate data required to derive the formulated
metrics.
Analysis.
The computation of metrics and the application of mathematical tools.
Interpretation.
The evaluation of metrics results in an effort to gain insight into the quality of
the representation.
Feedback.
Recommendations derived from the interpretation of product metrics
transmitted to the software team.
Goal-Oriented Software Measurement
The Goal/Question/Metric Paradigm
establish an explicit measurement goal
define a set of questions that must be answered to achieve the goal
identify well-formulated metrics that help to answer these questions.
Goal definition template
Analyze
{the name of activity or attribute to be measured}
for the purpose of
{the overall objective of the analysis}
with respect to
{the aspect of the activity or attribute that is considered}
from the viewpoint of
{the people who have an interest in the measurement}
in the context of
{the environment in which the measurement takes place}.
Ex> Goal definition for SafeHome
Analyze the SafeHome SW architecture for the purpose of
evaluating architectural components with respect to the ability to
make SafeHome more extensible from the viewpoint of the SW
engineers performing the work in the context of product
enhancement over the next 3 years
Questions
Q1: Are architectural components characterized in a manner that
compartmentalizes function and related data?
Answer: 0 … 10
Q2: Is the complexity of each component within bounds that will
facilitate modification and extension?
Answer: 0 … 1
Metrics Attributes
simple and computable.
It should be relatively easy to learn how to derive the metric, and
its computation should not demand inordinate effort or time
empirically and intuitively persuasive.
The metric should satisfy the engineer’s intuitive notions about
the product attribute under consideration
consistent and objective.
The metric should always yield results that are unambiguous.
consistent in its use of units and dimensions.
The mathematical computation of the metric should use
measures that do not lead to bizarre combinations of units.
an effective mechanism for quality feedback.
That is, the metric should provide a software engineer with
information that can lead to a higher quality end product
Collection and Analysis Principles
Whenever possible, data collection and analysis should be automated
Valid statistical techniques should be applied to establish relationships
between internal product attributes and external quality characteristics
Interpretative guidelines and recommendations should be established
for each metric
Function-Based Metrics
The function point metric (FP), first proposed by Albrecht [ALB79],
can be used effectively as a means for measuring the functionality
delivered by a system.
Function points are derived using an empirical relationship based on
countable (direct) measures of software's information domain and
assessments of software complexity
Information domain values are defined in the following manner:
number of external inputs (EIs)
number of external outputs (EOs)
number of external inquiries (EQs)
number of internal logical files (ILFs)
Number of external interface files (EIFs)
Function Points

Information Domain Value          Count     Weighting factor
                                            simple  average  complex
External Inputs (EIs)             __ x        3       4        6
External Outputs (EOs)            __ x        4       5        7
External Inquiries (EQs)          __ x        3       4        6
Internal Logical Files (ILFs)     __ x        7      10       15
External Interface Files (EIFs)   __ x        5       7       10
Count total = sum of all weighted counts
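A minimal sketch of how the count total combines with the value adjustment factors to yield FP, using the standard Albrecht weights above and the usual relationship FP = count total x [0.65 + 0.01 x sum(Fi)]. The example counts and the fourteen adjustment ratings F1..F14 (each 0-5) are hypothetical.

```python
# Sketch of Albrecht's function point computation. The weights are the
# standard simple/average/complex values; the counts and the 14 value
# adjustment ratings below are hypothetical examples.

WEIGHTS = {
    "EI":  (3, 4, 6),    # external inputs
    "EO":  (4, 5, 7),    # external outputs
    "EQ":  (3, 4, 6),    # external inquiries
    "ILF": (7, 10, 15),  # internal logical files
    "EIF": (5, 7, 10),   # external interface files
}
LEVEL = {"simple": 0, "average": 1, "complex": 2}

def function_points(counts, adjustment_factors):
    """FP = count_total * [0.65 + 0.01 * sum(Fi)]."""
    count_total = sum(n * WEIGHTS[dv][LEVEL[lvl]]
                      for (dv, lvl), n in counts.items())
    return count_total * (0.65 + 0.01 * sum(adjustment_factors))

# Hypothetical system: all information domain values rated "average".
counts = {("EI", "average"): 3, ("EO", "average"): 2, ("EQ", "average"): 2,
          ("ILF", "average"): 1, ("EIF", "average"): 4}
fp = function_points(counts, [3] * 14)  # count_total = 68, VAF = 1.07
```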
Architectural Design Metrics (black box)
Architectural design metrics
Structural complexity of a module m: S(m) = (fan-out of module m)^2
Data complexity: D(m) = (# of input & output variables) / (fan-out + 1)
System complexity: C(m) = S(m) + D(m)
Morphology metrics: a function of the number of modules
and the number of interfaces between modules
Size, depth, width, arc-to-node ratio
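The three complexity measures above can be sketched directly; the fan-out and variable counts in the example are hypothetical.

```python
# Sketch of the architectural complexity measures above; the example
# module numbers are hypothetical.

def structural_complexity(fan_out):
    """S(m) = fan-out(m)^2"""
    return fan_out ** 2

def data_complexity(num_io_vars, fan_out):
    """D(m) = v(m) / (fan-out(m) + 1), v = # of input & output variables"""
    return num_io_vars / (fan_out + 1)

def system_complexity(num_io_vars, fan_out):
    """C(m) = S(m) + D(m)"""
    return structural_complexity(fan_out) + data_complexity(num_io_vars, fan_out)

# A module with fan-out 3 and 8 input/output variables:
# S = 9, D = 8/4 = 2.0, C = 11.0
```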
Metrics for OO Design-I
Whitmire [WHI97] describes nine distinct and measurable
characteristics of an OO design:
Size
Size is defined in terms of four views: population, volume, length, and
functionality
Complexity
How the classes of an OO design are interrelated
Coupling
The physical connections between elements of the OO design
Sufficiency
“the degree to which an abstraction possesses the features required of it, or the
degree to which a design component possesses features in its abstraction,
from the point of view of the current application.”
Completeness
An indirect implication about the degree to which the abstraction or design
component can be reused
Metrics for OO Design-II
Cohesion
The degree to which all operations work together to achieve a
single, well-defined purpose
Primitiveness
Applied to both operations and classes, the degree to which an
operation is atomic
Similarity
The degree to which two or more classes are similar in terms of
their structure, function, behavior, or purpose
Volatility
Measures the likelihood that a change will occur
Distinguishing Characteristics
Berard [BER95] argues that the following characteristics require
that special OO metrics be developed:
Localization
the way in which information is concentrated in a program
Encapsulation
the packaging of data and processing
Information hiding
the way in which information about operational details is hidden by a
secure interface
Inheritance
the manner in which the responsibilities of one class are propagated to
another
Abstraction
the mechanism that allows a design to focus on essential details
Class-Oriented Metrics
Proposed by Chidamber and Kemerer (CK metrics):
weighted methods per class
∑(mi) where mi is a normalized complexity for method i
depth of the inheritance tree
number of children
coupling between object classes
response for a class
lack of cohesion in methods
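A toy sketch of three of the CK metrics, assuming a design is summarized as a parent map (class to superclass) plus per-method complexities; the SafeHome-style class names are hypothetical.

```python
# Toy sketch of three CK metrics over a hypothetical class hierarchy.

def wmc(method_complexities):
    """Weighted methods per class: sum of normalized method complexities."""
    return sum(method_complexities)

def dit(cls, parent):
    """Depth of inheritance tree: # of edges from cls up to the root."""
    depth = 0
    while parent.get(cls) is not None:
        cls = parent[cls]
        depth += 1
    return depth

def noc(cls, parent):
    """Number of children: immediate subclasses of cls."""
    return sum(1 for sub, sup in parent.items() if sup == cls)

parent = {"Sensor": None,
          "MotionSensor": "Sensor",
          "SmokeSensor": "Sensor"}
```

A deep tree (high DIT) means more method reuse but harder prediction of behavior; a high NOC dilutes the abstraction of the parent class.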
Applying CK Metrics (pg483-484)
The scene: Vinod's cubicle.
The players: Vinod, Jamie, Shakira, and Ed, members of the SafeHome
software engineering team, who are continuing work on
component-level design and test case design.
The conversation:
Vinod: Did you guys get a chance to read the description of the CK
metrics suite I sent you on Wednesday and make those measurements?
Shakira: Wasn't too complicated. I went back to my UML class and
sequence diagrams, like you suggested, and got rough counts for DIT,
RFC, and LCOM. I couldn't find the CRC model, so I didn't count CBO.
Jamie (smiling): You couldn't find the CRC model because I had it.
Shakira: That's what I love about this team, superb communication.
Vinod: I did my counts . . . did you guys develop numbers for the CK
metrics?
(Jamie and Ed nod in the affirmative.)
Jamie: Since I had the CRC cards, I took a look at CBO, and it looked
pretty uniform across most of the classes. There was one exception,
which I noted.
Ed: There are a few classes where RFC is pretty high, compared with
the averages . . . maybe we should take a look at simplifying them.
Jamie: Maybe yes, maybe no. I'm still concerned about time, and I
don't want to fix stuff that isn't really broken.
Vinod: I agree with that. Maybe we should look for classes that have
bad numbers in at least two or more of the CK metrics. Kind of two
strikes and you're modified.
Shakira (looking over Ed's list of classes with high RFC): Look, see
this class? It's got a high LCOM as well as a high RFC. Two strikes?
Vinod: Yeah, I think so . . . it'll be difficult to implement because
of complexity and difficult to test for the same reason. Probably
worth designing two separate classes to achieve the same behavior.
Jamie: You think modifying it'll save us time?
Vinod: Over the long haul, yes.
Class-Oriented Metrics
The MOOD Metrics Suite (Harrison, Counsell, and Nithi):
Method inheritance factor (MIF): the degree to which the class
architecture relies on inheritance
Coupling factor (CF): the ratio of actual couplings between classes to
the maximum possible number of couplings
Polymorphism factor (PF): the ratio of actual method overrides to the
maximum possible number of overridden methods
Class-Oriented Metrics
Proposed by Lorenz and Kidd [LOR94]:
class size
number of operations overridden by a subclass
number of operations added by a subclass
Component-Level Design Metrics
Cohesion metrics
a function of data objects and the locus of their definition
Coupling metrics
a function of input and output parameters, global
variables, and modules called
Complexity metrics
hundreds have been proposed (e.g., cyclomatic
complexity)
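Cyclomatic complexity, the most widely used of these, can be sketched from a module's flow graph; the edge and node counts in the example are hypothetical.

```python
# Sketch of McCabe's cyclomatic complexity; the example flow-graph
# counts are hypothetical.

def cyclomatic_complexity(edges, nodes):
    """V(G) = E - N + 2 for a connected flow graph."""
    return edges - nodes + 2

def cyclomatic_from_decisions(predicate_nodes):
    """Equivalently, V(G) = P + 1, P = # of binary decision points."""
    return predicate_nodes + 1

# A flow graph with 11 edges and 9 nodes has V(G) = 4, i.e. at most
# 4 linearly independent paths to cover in basis path testing.
```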
Operation-Oriented Metrics
Proposed by Lorenz and Kidd [LOR94]:
average operation size (OSavg)
operation complexity (OC)
average number of parameters per operation (NPavg)
Metrics for Testing
Testing effort can also be estimated using metrics derived
from Halstead measures
Binder [BIN94] suggests a broad array of design metrics
that have a direct influence on the “testability” of an OO
system.
Lack of cohesion in methods (LCOM).
Percent public and protected (PAP).
Public access to data members (PAD).
Number of root classes (NOR).
Fan-in (FIN).
Number of children (NOC) and depth of the inheritance tree (DIT).
Metrics for Maintenance
IEEE Std 982.1-1988 Software Maturity Index (SMI)
SMI = [MT - (Fa + Fc + Fd)] / MT
MT = # of modules in the current release
Fc = # of modules in the current release that have been changed
Fa = # of modules in the current release that have been added
Fd = # of modules from the preceding release that were deleted in
the current release
As SMI approaches 1.0, the product begins to stabilize.
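The SMI computation can be sketched directly; the release figures in the example are hypothetical.

```python
# Sketch of the software maturity index; the module counts below are
# hypothetical examples.

def software_maturity_index(mt, fa, fc, fd):
    """SMI = [MT - (Fa + Fc + Fd)] / MT; approaches 1.0 as the product
    stabilizes (fewer modules added, changed, or deleted per release)."""
    return (mt - (fa + fc + fd)) / mt

# Release with 940 modules: 90 added, 40 changed, 12 deleted since
# the preceding release.
smi = software_maturity_index(940, 90, 40, 12)  # (940 - 142) / 940
```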