Lec16 ch30
Product Metrics
McCall’s Triangle of Quality (1970s)
PRODUCT REVISION: Maintainability, Flexibility, Testability
PRODUCT TRANSITION: Portability, Reusability, Interoperability
PRODUCT OPERATION: Correctness, Usability, Efficiency, Reliability, Integrity
Measures, Metrics and Indicators
A SW engineer collects measures and develops metrics in order to obtain indicators
A measure provides a quantitative indication of the extent, amount,
dimension, capacity, or size of some attribute of a product or process
The IEEE defines a metric as “a quantitative measure of the degree to
which a system, component, or process possesses a given attribute.”
IEEE Standard Glossary of Software Engineering Terminology (IEEE Std
610.12-1990)
An indicator is a metric or combination of metrics that provides insight into
the software process, a software project, or the product itself
Ex. Moonzoo Kim
Measure: height = 170 cm, weight = 65 kg
Metric: fat metric = 0.38 (= weight/height)
Indicator: normal health condition (since fat metric < 0.5)
Measurement Principles
The objectives of measurement should be established before data
collection begins
Ex. It is useless for black-box testers to measure the # of words in a C file.
Ex. It is useful for C compiler developers to measure the # of words in a C file.
Each technical metric should be defined in an unambiguous manner
Ex. For measuring the total number of lines of a C program:
Including comments? Including empty lines?
Metrics should be derived based on a theory that is valid for the domain
of application
Metrics for design should draw upon basic design concepts and principles
and attempt to provide an indication of the presence of a desirable attribute
Metrics should be tailored to best accommodate specific products and
processes
Measurement Process
Formulation
The derivation of software measures and metrics appropriate for the
representation of the software that is being considered.
Ex. To check whether a given software is hot-spotted (i.e., has intensive loops)
Collection
The mechanism used to accumulate data required to derive the formulated metrics.
Ex. Instrument a source program/binary to count how many times a given
statement is executed in one second
Analysis
The computation of metrics and the application of mathematical tools.
Ex. Using Excel/MatLab to get the average numbers of executions of statements
Interpretation
The evaluation of metrics results in an effort to gain insight into the quality
of the representation.
Ex. If there exist statements which were executed more than 10^8 times on a
3 GHz machine, then the program is hot-spotted
Feedback
Recommendations derived from the interpretation of product metrics transmitted
to the software team.
Ex. Try to optimize those hot-spotted statements. Or those hot-spotted
statements might have logical flaws
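A minimal sketch of the Collection step, assuming Python rather than instrumented C: the hypothetical helper `count_line_executions` below uses `sys.settrace` to count how many times each line of a toy function executes, which is the raw data a hot-spot analysis would start from.

```python
import sys
from collections import Counter

def count_line_executions(func, *args):
    """Instrument a Python function: count how many times each of its
    source lines executes (the raw data for a hot-spot analysis)."""
    counts = Counter()

    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is func.__code__:
            counts[frame.f_lineno] += 1
        return tracer  # keep local tracing enabled

    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return counts

def busy(n):
    """Toy program whose loop body is the hot spot."""
    total = 0
    for i in range(n):
        total += i
    return total

counts = count_line_executions(busy, 1000)
hot_line, hot_count = counts.most_common(1)[0]
```

The Interpretation step would then compare `hot_count` against a threshold such as the 10^8-executions-per-second criterion on the slide.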
Goal-Oriented Software Measurement
The Goal/Question/Metric Paradigm
establish an explicit measurement goal
define a set of questions that must be answered to achieve the goal
identify well-formulated metrics that help to answer these questions.
Goal definition template
Analyze
{the name of activity or attribute to be measured}
for the purpose of
{the overall objective of the analysis}
with respect to
{the aspect of the activity or attribute that is considered}
from the viewpoint of
{the people who have an interest in the measurement}
in the context of
{the environment in which the measurement takes place}.
Ex. Goal definition for SafeHome
Analyze the SafeHome SW architecture
for the purpose of evaluating architectural components
with respect to the ability to make SafeHome more extensible
from the viewpoint of the SW engineers performing the work
in the context of product enhancement over the next 3 years
Questions
Q1: Are architectural components characterized in a manner that
compartmentalizes function and related data?
Answer: 0 … 10
Q2: Is the complexity of each component within bounds that will
facilitate modification and extension?
Answer: 0 … 1
Metrics Attributes
Simple and computable.
It should be relatively easy to learn how to derive the metric,
and its computation should not demand inordinate effort or time
Empirically and intuitively persuasive.
The metric should satisfy the engineer’s intuitive notions about
the product attribute under consideration
Consistent and objective.
The metric should always yield results that are unambiguous.
Consistent in its use of units and dimensions.
The mathematical computation of the metric should use
measures that do not lead to bizarre combinations of units.
ex. MZ measure of software complexity: kg x m^4
An effective mechanism for quality feedback.
That is, the metric should provide a software engineer with
information that can lead to a higher quality end product
Collection and Analysis Principles
Whenever possible, data collection and analysis should
be automated
Valid statistical techniques should be applied to establish
relationship between internal product attributes and
external quality characteristics
Interpretative guidelines and recommendations should
be established for each metric
Ex. Fat metric greater than 0.5 indicates obesity. A person who
has more than 0.7 fat metric should consult a doctor.
Overview of Ch30. Product Metrics
30.1 A Framework for Product Metrics
30.2 Metrics for the Requirement Model
Function point metrics
30.3 Metrics for the Design Model
Architectural design metrics
Metrics for OO design
Class-oriented metrics
Component-level design metrics
Operation oriented metrics
30.4 Design Metrics for Web and Mobile Apps
30.5 Metrics for Source Code
30.6 Metrics for Testing
30.7 Metrics for Maintenance
Metrics for the Analysis Model
These metrics examine the analysis model with the
intent of predicting the “size” of the resultant system
Size can be one indicator of design complexity
Size can also be an indicator of increased coding,
integration, and testing effort
Example
Function-based metrics
Metrics for specification quality
Function-Based Metrics
The function point metric (FP), first proposed by Albrecht [ALB79],
can be used effectively as a means for measuring the functionality
delivered by a system.
Function points are derived using an empirical relationship based on
countable (direct) measures of software's information domain and
assessments of software complexity
Information domain values are defined in the following manner:
number of external inputs (EIs)
often used to update internal logical files
number of external outputs (EOs)
number of external inquiries (EQs)
number of internal logical files (ILFs)
number of external interface files (EIFs)
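As a hedged illustration of how these information domain values combine, the sketch below computes an FP value using the standard simple/average/complex weights; the counts and the nominal value-adjustment factors are hypothetical.

```python
# Standard simple/average/complex weights for each information domain value
WEIGHTS = {
    "EI":  (3, 4, 6),
    "EO":  (4, 5, 7),
    "EQ":  (3, 4, 6),
    "ILF": (7, 10, 15),
    "EIF": (5, 7, 10),
}

def function_points(counts, complexity="average", adjustment_factors=None):
    """FP = count-total x [0.65 + 0.01 x sum(Fi)], where the Fi are the
    14 value adjustment factors (each rated 0..5)."""
    idx = {"simple": 0, "average": 1, "complex": 2}[complexity]
    count_total = sum(WEIGHTS[k][idx] * n for k, n in counts.items())
    fis = adjustment_factors or [3] * 14  # assume nominal influence
    return count_total * (0.65 + 0.01 * sum(fis))

# hypothetical counts for a small system
fp = function_points({"EI": 3, "EO": 2, "EQ": 2, "ILF": 1, "EIF": 4})
```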
Function Points
The FP count table weights each information domain value by complexity:
Information Domain Value         Weighting factor (simple / average / complex)
external inputs (EIs)            3 / 4 / 6
external outputs (EOs)           4 / 5 / 7
external inquiries (EQs)         3 / 4 / 6
internal logical files (ILFs)    7 / 10 / 15
external interface files (EIFs)  5 / 7 / 10
Count total = sum of (count x weight) over all information domain values
Architectural Design Metrics (Card and Glass)
Structural complexity S(i) = f(i)^2, where f(i) is the fan-out of module i,
i.e. the # of modules that are directly invoked by the module
Data complexity D(i) = v(i)/(f(i)+1), where v(i) = # of input & output
variables passed to and from the module
System complexity C(i) = structural complexity + data complexity = S(i) + D(i)
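The architectural complexity relations above (structural, data, and system complexity) can be sketched directly; the module values below are hypothetical.

```python
def structural_complexity(fan_out):
    """S(i) = f(i)^2, where f(i) is the fan-out of module i."""
    return fan_out ** 2

def data_complexity(io_vars, fan_out):
    """D(i) = v(i) / (f(i) + 1), v(i) = # of input & output variables."""
    return io_vars / (fan_out + 1)

def system_complexity(io_vars, fan_out):
    """C(i) = S(i) + D(i)."""
    return structural_complexity(fan_out) + data_complexity(io_vars, fan_out)

# hypothetical module: invokes 2 modules, passes 6 input/output variables
c = system_complexity(io_vars=6, fan_out=2)  # 4 + 2.0 = 6.0
```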
Morphology Metrics
Morphology metrics: a function of the number of modules
and the number of interfaces between modules
Size = n + a, where n = the # of nodes (modules) and a = the # of arcs (interfaces)
Depth = the longest path from the root node to a leaf node
Width = maximum # of nodes at any one level of the architecture
Arc-to-node ratio r = a/n, a measure of the connectivity density of the architecture
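A minimal sketch of these morphology metrics, assuming the architecture is given as a dictionary mapping each module to the modules it invokes (the six-module architecture and its root name are hypothetical):

```python
def morphology_metrics(children, root):
    """children: {module: [modules it invokes]}, forming a tree.
    Returns (size, depth, width, arc_to_node_ratio)."""
    nodes = set(children) | {c for cs in children.values() for c in cs}
    n = len(nodes)                                # modules
    a = sum(len(cs) for cs in children.values())  # interfaces (arcs)
    level, depth, width = [root], 0, 1
    seen = {root}
    while True:  # level-order walk to find depth and width
        nxt = [c for m in level for c in children.get(m, []) if c not in seen]
        if not nxt:
            break
        seen.update(nxt)
        depth += 1
        width = max(width, len(nxt))
        level = nxt
    return n + a, depth, width, a / n

arch = {"root": ["a", "b"], "a": ["c", "d"], "b": ["e"]}
size, depth, width, ratio = morphology_metrics(arch, "root")
# size = 6 nodes + 5 arcs = 11, depth = 2, width = 3, ratio = 5/6
```

Depth here is counted in edges along the longest root-to-leaf path.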
Metrics for OO Design-I
Whitmire [WHI97] describes nine distinct and measurable
characteristics of an OO design:
Size
Size is defined in terms of the following four views:
Population: a static count of OO entities such as classes
Volume: a dynamic count of OO entities such as objects
Length: a measure of a chain of interconnected design elements
Functionality: value delivered to the customer
Complexity
How classes of an OO design are interrelated to one another
Coupling
The physical connections between elements of the OO design
The # of collaborations between classes
Sufficiency
“the degree to which an abstraction possesses the features required of it, ...
from the point of view of the current application.”
Whether the abstraction (class) possesses the features required of it
Metrics for OO Design-II
Completeness
An indirect implication about the degree to which the abstraction or
design component can be reused
Cohesion
The degree to which all operations working together to achieve a
single, well-defined purpose
Primitiveness
Applied to both operations and classes, the degree to which an
operation is atomic
Similarity
The degree to which two or more classes are similar in terms of
their structure, function, behavior, or purpose
Volatility
Measures the likelihood that a change will occur
Distinguishing Characteristics
Berard [BER95] argues that the following characteristics require
that special OO metrics be developed:
Encapsulation
the packaging of data and processing
Information hiding
the way in which information about operational details is hidden by a
secure interface
Inheritance
the manner in which the responsibilities of one class are propagated to
another
Abstraction
the mechanism that allows a design to focus on essential details
Localization
the way in which information is concentrated in a program
Class-Oriented Metrics
Proposed by Chidamber and Kemerer (CK metrics):
Weighted methods per class (WMC) = sum of ci, where ci is
a normalized complexity for method i
The # of methods and their complexity are reasonable
indicators of the amount of effort required to implement and
test a class
As the # of methods grows for a given class, it is likely to
become more application specific -> less reusability
Counting the # of methods is not trivial
Depth of the inheritance tree
As DIT grows, potential difficulties arise when attempting
to predict the behavior of a class
Class-Oriented Metrics
Number of children/subclasses (NOC)
As NOC grows, more reuse, but the abstraction of the parent class
is diluted
As NOC grows, the amount of testing will also increase
Coupling between object classes (CBO)
CBO is the # of collaborations listed on CRC index cards
As CBO increases, reusability decreases
Response for a class (RFC)
The set of methods that can potentially be executed in response
to a message received by an object of the class
As RFC increases, test sequence grows
Lack of cohesion in methods (LCOM)
The # of methods that access the same attributes
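As an illustration of one common LCOM formulation (LCOM = P - Q, where P is the number of method pairs sharing no attributes and Q is the number sharing at least one, floored at zero), with a hypothetical method-to-attribute map:

```python
from itertools import combinations

def lcom(method_attrs):
    """LCOM (pairwise formulation): P = method pairs sharing no
    attributes, Q = pairs sharing at least one; LCOM = max(P - Q, 0)."""
    p = q = 0
    for a, b in combinations(method_attrs.values(), 2):
        if set(a) & set(b):
            q += 1
        else:
            p += 1
    return max(p - q, 0)

# hypothetical class: which attributes each method accesses
score = lcom({"m1": {"a"}, "m2": {"a"}, "m3": {"b"}, "m4": {"b"}})
# pairs sharing attributes: (m1,m2), (m3,m4) -> Q = 2; P = 4; LCOM = 2
```

A high score suggests the class serves two purposes and might be split.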
Applying CK Metrics
The scene: Vinod's cubicle.
The players: Vinod, Jamie, Shakira, and Ed, members of the SafeHome software
engineering team, who are continuing work on component-level design and test
case design.
The conversation:
Vinod: Did you guys get a chance to read the description of the CK metrics
suite I sent you on Wednesday and make those measurements?
Shakira: Wasn't too complicated. I went back to my UML class and sequence
diagrams, like you suggested, and got rough counts for DIT, RFC, and LCOM. I
couldn't find the CRC model, so I didn't count CBO.
Jamie (smiling): You couldn't find the CRC model because I had it.
Shakira: That's what I love about this team, superb communication.
Vinod: I did my counts . . . did you guys develop numbers for the CK metrics?
(Jamie and Ed nod in the affirmative.)
Jamie: Since I had the CRC cards, I took a look at CBO, and it looked pretty
uniform across most of the classes. There was one exception, which I noted.
Ed: There are a few classes where RFC is pretty high, compared with the
averages . . . maybe we should take a look at simplifying them.
Jamie: Maybe yes, maybe no. I'm still concerned about time, and I don't want
to fix stuff that isn't really broken.
Vinod: I agree with that. Maybe we should look for classes that have bad
numbers in at least two or more of the CK metrics. Kind of two strikes and
you're modified.
Shakira (looking over Ed's list of classes with high RFC): Look, see this
class? It's got a high LCOM as well as a high RFC. Two strikes?
Vinod: Yeah I think so . . . it'll be difficult to implement because of
complexity and difficult to test for the same reason. Probably worth designing
two separate classes to achieve the same behavior.
Jamie: You think modifying it'll save us time?
Vinod: Over the long haul, yes.
Class-Oriented Metrics
The MOOD Metrics Suite
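The MOOD suite is not detailed on the slide; as one hedged example, its method inheritance factor (MIF = inherited methods / total available methods across all classes) can be sketched as follows, with a hypothetical two-class design:

```python
def method_inheritance_factor(classes):
    """MOOD MIF: the fraction of all available methods that are
    inherited rather than defined in the class itself."""
    inherited = sum(len(c["inherited"]) for c in classes)
    available = sum(len(c["inherited"]) + len(c["defined"]) for c in classes)
    return inherited / available

# hypothetical design: the second class inherits start/stop from the first
classes = [
    {"defined": {"start", "stop"}, "inherited": set()},
    {"defined": {"reset"}, "inherited": {"start", "stop"}},
]
mif = method_inheritance_factor(classes)  # 2 inherited / 5 available = 0.4
```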
Class-Oriented Metrics
Proposed by Lorenz and Kidd [LOR94]:
class size
number of operations overridden by a subclass
number of operations added by a subclass
Component-Level Design Metrics
Cohesion metrics
a function of data objects and the locus of their definition
Coupling metrics
a function of input and output parameters, global
variables, and modules called
Complexity metrics
hundreds have been proposed (e.g., cyclomatic
complexity)
Operation-Oriented Metrics
Proposed by Lorenz and Kidd [LOR94]:
average operation size
# of messages sent by the operation
operation complexity
average number of parameters per operation
Metrics for Source Code
Halstead’s Software Science: a comprehensive
collection of metrics based on the number (count and
occurrence) of operators and operands within a
component or program
n1: # of distinct operators that appear in a program
n2: # of distinct operands that appear in a program
N1: # of operator occurrences
N2: # of operand occurrences
Program length N = n1 log2 n1 + n2 log2 n2
Program volume V= (N1+N2) log2 (n1 + n2)
And many more metrics
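A sketch of these counts and formulas for a toy expression language; the tokenizer and the operator set are simplifying assumptions, not a real C lexer.

```python
import math
import re

def halstead(source, operators):
    """Count distinct/total operators and operands, then compute
    Halstead program length N and volume V."""
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|\S", source)
    ops = [t for t in tokens if t in operators]
    operands = [t for t in tokens if t not in operators]
    n1, n2 = len(set(ops)), len(set(operands))
    N1, N2 = len(ops), len(operands)
    N = n1 * math.log2(n1) + n2 * math.log2(n2)  # program length
    V = (N1 + N2) * math.log2(n1 + n2)           # program volume
    return n1, n2, N1, N2, N, V

n1, n2, N1, N2, N, V = halstead("z = x + x * y", operators={"=", "+", "*"})
# n1 = 3 distinct operators, n2 = 3 distinct operands (z, x, y),
# N1 = 3 operator occurrences, N2 = 4 operand occurrences
```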
Cyclomatic Complexity
• A quantitative measure of the logical
complexity
CS350
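Cyclomatic complexity can be computed from the control-flow graph as V(G) = E - N + 2 (edges minus nodes plus 2, for a connected graph); the CFG below, one if-else followed by one while loop, is a hypothetical example.

```python
def cyclomatic_complexity(edges, nodes):
    """V(G) = E - N + 2 for a connected control-flow graph."""
    return len(edges) - len(nodes) + 2

# hypothetical CFG: one if-else decision followed by one while loop
nodes = ["entry", "if", "then", "else", "while", "body", "exit"]
edges = [("entry", "if"), ("if", "then"), ("if", "else"),
         ("then", "while"), ("else", "while"),
         ("while", "body"), ("body", "while"), ("while", "exit")]
v = cyclomatic_complexity(edges, nodes)  # 8 - 7 + 2 = 3
```

V(G) = 3 matches the two decision points plus one, and bounds the number of independent paths to test.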
Metrics for Testing
Testing effort can also be estimated using metrics derived
from Halstead measures
Binder [BIN94] suggests a broad array of design metrics
that have a direct influence on the “testability” of an OO
system.
Lack of cohesion in methods (LCOM).
Percent public and protected (PAP).
Public access to data members (PAD).
Number of root classes (NOR).
Fan-in (FIN).
Number of children (NOC) and depth of the inheritance tree (DIT).
Metrics for Maintenance
IEEE Std 982.1-1998 Software Maturity Index (SMI)
SMI = [Mt - (Fa + Fc + Fd)]/Mt
Mt = # of modules in the current release
Fc = # of modules in the current release that have been changed
Fa = # of modules in the current release that have been added
Fd = # of modules from the preceding release that were deleted
in the current release
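The SMI formula can be sketched directly; the release counts below are hypothetical.

```python
def software_maturity_index(m_t, f_a, f_c, f_d):
    """SMI = [Mt - (Fa + Fc + Fd)] / Mt; SMI approaches 1.0 as the
    product begins to stabilize (fewer modules added/changed/deleted)."""
    return (m_t - (f_a + f_c + f_d)) / m_t

# hypothetical release: 120 modules total, 6 added, 9 changed, 3 deleted
smi = software_maturity_index(m_t=120, f_a=6, f_c=9, f_d=3)  # 0.85
```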
Design Structure Quality Index
(DSQI)
Developed by U.S. Air Force Systems Command
DSQI (ranging 0 to 1) is calculated from the following 7
values
S1 = the total # of modules defined in the program architecture
S2 = the # of modules whose correct function depends on the
source of data input or that produce data to be used elsewhere
…
S7 = the # of modules with a single entry and exit