SE Module III Part-B

Uploaded by mbucse a9

Process and Project Metrics
Measurement
• Provides a mechanism for objective evaluation
• Assists in
– Estimation
– Quality control
– Productivity assessment
– Project Control
– Tactical decision-making
• Acts as management tool

2
Metrics in the Process and Project Domains
• Process metrics are collected across all projects and
over long periods of time
• Project metrics enable a software project manager to
– Assess the status of an ongoing project
– Track potential risks
– Uncover problem areas before they go “critical”
– Adjust work flow or tasks
– Evaluate the project team’s ability to control quality of
software work products

3
Process Metrics and Software Process Improvement

Fig: 22.1 - Determinants for s/w quality and organizational effectiveness: the product is shaped by the process, which is in turn influenced by people, the development environment, technology, customer characteristics, and business conditions


4
• We measure the efficiency of a s/w process
indirectly, based on outcomes
• Probable outcomes are
– Measures of errors uncovered before release of the s/w
– Defects delivered to and reported by end-users
– Work products delivered (productivity)
– Human effort expended
– Calendar time expended
– Schedule conformance etc.
• There are “private and public” uses for different
types of process data

5
– Use common sense and organizational sensitivity when
interpreting metrics data
– Provide regular feedback to the individuals and teams who
collect measures and metrics
– Don’t use metrics to appraise individuals
– Work with practitioners and teams to set clear goals and
metrics that will be used to achieve them
– Never use metrics to threaten individuals or teams
– Metrics data that indicate a problem area should not be
considered “negative”. These data are merely an indicator
for process improvement
– Don’t obsess on a single metric to the exclusion of other
important metrics

6
Process Metrics and Software Process Improvement
• Statistical Software Process Improvement (SSPI)
• Error
– Some flaw in a s/w engineering work product that is
uncovered before the s/w is delivered to the end-user
• Defect
– A flaw that is uncovered after delivery to the end-user

7
Project Metrics
• Used during estimation
• Used to monitor and control progress
• The intent is twofold
– Minimize the development schedule
– Assess product quality on an ongoing basis
• Leads to a reduction in overall project cost

8
Software Measurement
• S/W measurement can be categorized in
two ways:
1. Direct measures of the s/w process (e.g., cost
and effort applied) and product (e.g., lines of
code (LOC) produced, etc.)
2. Indirect measures of the product (e.g.,
functionality, quality, complexity, etc.)
• Requires normalization of both size- and
function-oriented metrics

9
Size-Oriented Metrics (1)
• Lines of Code (LOC) can be chosen as the
normalization value
• Example of simple size-oriented metrics
– Errors per KLOC (thousand lines of code)
– Defects per KLOC
– $ per KLOC
– Pages of documentation per KLOC
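The size-oriented metrics above can be sketched in a few lines of Python. The function name and the sample figures below are illustrative, not taken from the slides:

```python
# Illustrative sketch: derive the simple size-oriented metrics listed
# above from raw project data, normalized by KLOC.
def size_oriented_metrics(loc, errors, defects, cost_dollars, doc_pages):
    kloc = loc / 1000.0  # normalize by thousands of lines of code
    return {
        "errors_per_kloc": errors / kloc,
        "defects_per_kloc": defects / kloc,
        "dollars_per_kloc": cost_dollars / kloc,
        "doc_pages_per_kloc": doc_pages / kloc,
    }

m = size_oriented_metrics(loc=12_000, errors=130, defects=27,
                          cost_dollars=168_000, doc_pages=365)
print(m["errors_per_kloc"])  # 130 errors / 12 KLOC ≈ 10.83
```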

10
Size-Oriented Metrics (2)
• Controversy regarding use of LOC as a key measure
– According to the proponents
• LOC is an “artifact” of all s/w development projects
• Many existing s/w estimation models use LOC or KLOC as a key
input
– According to the opponents
• LOC measures are programming language dependent
• They penalize well-designed but shorter programs
• Cannot easily accommodate nonprocedural languages
• Difficult to predict during estimation

11
Function-Oriented Metrics (1)
• The most widely used function-oriented
metric is the function point (FP)
• Computation of the FP is based on
characteristics of the software’s information
domain and complexity

12
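As a sketch of how an FP count is typically computed from information-domain values: the weights below are the commonly published IFPUG "average" complexity weights, and the sample counts are invented for illustration (the slides' own counting tables are not reproduced here). A real count would rate each item simple, average, or complex.

```python
# "Average" complexity weights for the five information domain values
# (commonly published IFPUG-style figures; an assumption here, since the
# slides' tables are not shown).
WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

def function_points(counts, value_adjustment_factors):
    """counts: dict keyed like WEIGHTS; factors: 14 ratings, each 0..5."""
    unadjusted = sum(counts[k] * w for k, w in WEIGHTS.items())
    # Value adjustment: FP = count_total * (0.65 + 0.01 * sum(Fi))
    return unadjusted * (0.65 + 0.01 * sum(value_adjustment_factors))

counts = {"external_inputs": 24, "external_outputs": 16,
          "external_inquiries": 22, "internal_logical_files": 4,
          "external_interface_files": 2}
fp = function_points(counts, [3] * 14)  # all 14 factors rated "average"
print(round(fp, 2))  # unadjusted count 318, adjusted ≈ 340.26
```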
13
14
Function-Oriented Metrics (2)
• Controversy regarding use of FP as a key measure
– According to the proponents
• It is programming language independent
• Can be predicted before coding is started
– According to the opponents
• Based on subjective rather than objective data
• Has no direct physical meaning – it’s just a number

15
Object-Oriented Metrics
• Number of Scenario scripts
• Number of key classes
• Number of support classes
• Average number of support classes per key class
• Number of subsystems

16
Use-Case Oriented Metrics
• The use-case is independent of programming
language
• The number of use-cases is directly proportional to the size of the
application in LOC and to the number of test cases
• There is no standard size for a use-case
• Its application as a normalizing measure is suspect

17
Web Engineering Project Metrics (1)
• Number of static Web pages
• Number of dynamic Web pages
• Number of internal page links
• Number of persistent data objects
• Number of external systems interfaced
• Number of static content objects
• Number of dynamic content objects
• Number of executable functions

18
Web Engineering Project Metrics (2)
• Let,
– Nsp = number of static Web pages
– Ndp = number of dynamic Web pages
• Then,
– Customization index, C = Ndp/(Ndp + Nsp)
• The value of C ranges from 0 to 1
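A minimal sketch of the customization index (the function name is my own):

```python
# Customization index C = Ndp / (Ndp + Nsp): the fraction of Web pages
# that are dynamic. C near 1 means a highly customized/dynamic WebApp.
def customization_index(n_dynamic, n_static):
    return n_dynamic / (n_dynamic + n_static)

print(customization_index(n_dynamic=30, n_static=70))  # 0.3
```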

19
Metrics for Software Quality
• Goals of s/w engineering
– Produce high-quality systems
– Meet deadlines
– Satisfy market need
• The primary thrust at the project level is to measure
errors and defects

20
Measuring Quality
• Correctness
– Defects per KLOC
• Maintainability
– Mean-time-to-change (MTTC)
• Integrity
– Threat and security
– integrity = Σ [1 – (threat × (1 – security))]
• Usability
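A minimal sketch of the integrity measure, summing 1 − (threat × (1 − security)) over attack types, where threat is the probability an attack of a given type occurs and security is the probability it is repelled. The probabilities below are illustrative:

```python
# Integrity = sum over attack types of 1 - (threat * (1 - security)).
def integrity(threat_security_pairs):
    return sum(1 - (threat * (1 - security))
               for threat, security in threat_security_pairs)

# Single attack type: 25% chance of attack, 95% chance it is repelled.
print(round(integrity([(0.25, 0.95)]), 4))  # 1 - 0.25*0.05 = 0.9875
```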

21
Defect Removal Efficiency (DRE)
• Can be used at both the project and process level
• DRE = E / (E + D), [E = Error, D = Defect]
• Or, for the ith framework activity, DREi = Ei / (Ei + E(i+1)), where Ei =
errors found during activity i and E(i+1) = errors found during the next
activity that are traceable back to activity i
• Try to achieve DREi that approaches 1
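Both forms of DRE can be sketched as follows (function names and sample counts are illustrative):

```python
# Project-level DRE: E = errors found before delivery, D = defects
# found after delivery. DRE = E / (E + D).
def dre(errors_before, defects_after):
    return errors_before / (errors_before + defects_after)

# Per-activity DRE_i = E_i / (E_i + E_{i+1}): errors caught in activity i
# versus those that leak into the next activity.
def dre_activity(errors_in_activity, errors_passed_to_next):
    return errors_in_activity / (errors_in_activity + errors_passed_to_next)

print(dre(errors_before=97, defects_after=3))  # 0.97
print(dre_activity(50, 10))                    # 50/60 ≈ 0.833
```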

22
Integrating Metrics within the Software Process

Fig: 22.3 - Software metrics collection process: the software engineering
process, software project, and software product feed data collection, which
produces measures (e.g., LOC, FP, NOP, defects, errors); metrics computation
turns measures into metrics (e.g., no. of FP, size, errors/KLOC, DRE); metrics
evaluation yields indicators (e.g., process efficiency, product complexity,
relative overhead)

23
