Software metrics and estimation
McGill ECSE 428
Software Engineering Practice
Radu Negulescu
Winter 2004
About this module
Measuring software is very subjective and approximate, but necessary to
answer key questions in running a software project:
• Planning: How much time/money needed?
• Monitoring: What is the current status?
• Control: How to decide closure?
Metrics
What to measure/estimate?
Product metrics
• Size: LOC, modules, etc.
• Scope/specification: function points
• Quality: defects, defects/LOC, P1-defects, etc.
• Lifecycle statistics: requirements, fixed defects, open issues, etc.
• ...
Project metrics
• Time
• Effort: person-months
• Cost
• Test cases
• Staff size
• ...
Basis for estimation
What data can be used as basis for estimation?
• Measures of size/scope
• Baseline data (from previous projects)
• Developer commitments
• Expert judgment
• “Industry standard” parameters
Uncertainty of estimation
Cone of uncertainty
• [McConnell Fig. 8-2]
• [McConnell Table 8-1]
Sources of uncertainty
• Product related
Requirements change
Type of application (system, shrinkwrap, client-server, real-time, ...)
• Staff related
Sick days, vacation time
Turnover
Individual abilities
Analysts, developers (10:1 differences)
Debugging (20:1 differences)
Team productivity (5:1 differences)
• Process related
Tool support (or lack thereof)
Process used
• …
Estimate-convergence graph
[Estimate-convergence graph, after McConnell: estimate ranges for project cost (effort and size) and project schedule narrow from 0.25×..4× and 0.6×..1.6× at initial product definition, through approved product definition, requirements specification, product design specification, and detailed design specification, converging to 1.0× at product complete]
LOC metrics
LOC = lines of code
A measure of the size of a program
• Logical LOC vs. physical LOC
Not including comments and blank lines
Split lines count as one
• Rough approximation: #statements, semicolons
Advantages
• Easy to measure
• Easy to automate
• Objective
Disadvantages
• Easy to falsify
• Encourages counter-productive coding practices
• Implementation-biased
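A minimal sketch of a LOC counter following the conventions above; the comment syntax handled ("//" line comments only) and the file name are assumptions, not part of the slides.

```python
# Count physical LOC (non-blank, non-comment lines) and a rough logical LOC
# (semicolon count, so a statement split across lines still counts once).
def count_loc(path):
    physical = 0
    logical = 0
    with open(path) as f:
        for line in f:
            stripped = line.strip()
            if not stripped or stripped.startswith("//"):  # blank line or comment
                continue
            physical += 1
            logical += stripped.count(";")
    return physical, logical

# Example (hypothetical file name):
# print(count_loc("Module.java"))
```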
FP metrics
A measure of the scope of the program
• External inputs (EI)
Number of screens, forms, dialogues, controls, or messages through which
an end user or another program adds, deletes, or changes data
• External outputs (EO)
Screens, reports, graphs or messages generated for use by end users or
other programs
• External inquiries (EQ)
Direct accesses to data in a database
• Internal logical files (ILF)
Major groups of end user data, could be a “file” or “database table”
• External interface files (EIF)
Files controlled by other applications with which the program interacts
Examples
[Source: David Longstreet]
[Example screenshots illustrating EI, EO, EQ, ILF, and EIF omitted]
FP metrics
Complexity weights
Low Med High
EI 3 4 6
EO 4 5 7
EQ 3 4 6
ILF 7 10 15
EIF 5 7 10
Influence multiplier: 0.65..1.35
• 14 factors
Counting function points
Program Characteristic       Low Complexity    Medium Complexity   High Complexity    Total
                             (count × mult.)   (count × mult.)     (count × mult.)
Inputs                       6 × 3             2 × 4               3 × 6              44
Outputs                      7 × 4             7 × 5               0 × 7              63
Inquiries                    0 × 3             2 × 4               4 × 6              32
Logical Internal Files       5 × 7             2 × 10              3 × 15             100
External Interface Files     9 × 5             0 × 7               2 × 10             65
Unadjusted Function Point total: 304
Influence Multiplier: 1.15
Adjusted Function Point Total: 349.6
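The count above can be reproduced mechanically. Below is a minimal Python sketch (not from the slides) that applies the complexity weights listed under "FP metrics" to the counts of this example.

```python
# Complexity weights (low, medium, high) per program characteristic.
WEIGHTS = {
    "Inputs":                   (3, 4, 6),
    "Outputs":                  (4, 5, 7),
    "Inquiries":                (3, 4, 6),
    "Logical Internal Files":   (7, 10, 15),
    "External Interface Files": (5, 7, 10),
}

# Counts (low, medium, high) from the worked example above.
COUNTS = {
    "Inputs":                   (6, 2, 3),
    "Outputs":                  (7, 7, 0),
    "Inquiries":                (0, 2, 4),
    "Logical Internal Files":   (5, 2, 3),
    "External Interface Files": (9, 0, 2),
}

ufp = sum(c * w for name in WEIGHTS for c, w in zip(COUNTS[name], WEIGHTS[name]))
print(ufp)                    # 304 unadjusted function points
print(round(ufp * 1.15, 1))   # 349.6 adjusted (the influence multiplier is derived below)
```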
Influence factors
1. Data communications: How many communication facilities are there to aid in the transfer or exchange of information with the application or system?
2. Distributed data processing: How are distributed data and processing functions handled?
3. Performance: Did the user require response time or throughput?
4. Heavily used configuration: How heavily used is the current hardware platform where the application will be executed?
5. Transaction rate: How frequently are transactions executed daily, weekly, monthly, etc.?
6. On-Line data entry: What percentage of the information is entered On-Line?
7. End-user efficiency: Was the application designed for end-user efficiency?
Influence factors
8. On-Line update: How many ILFs are updated by On-Line transaction?
9. Complex processing: Does the application have extensive logical or mathematical processing?
10. Reusability: Was the application developed to meet one or many users’ needs?
11. Installation ease: How difficult is conversion and installation?
12. Operational ease: How effective and/or automated are start-up, back up, and recovery procedures?
13. Multiple sites: Was the application specifically designed, developed, and supported to be installed at multiple sites for multiple organizations?
14. Facilitate change: Was the application specifically designed, developed, and supported to facilitate change?
Influence score
Score   Influence
0       Not present, or no influence
1       Incidental influence
2       Moderate influence
3       Average influence
4       Significant influence
5       Strong influence throughout
Influence score
INF = 0.65 + SCORE/100, where SCORE is the sum of the ratings of the 14 influence factors (0..70), so INF ranges from 0.65 to 1.35
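A quick sketch of the calculation; the individual ratings below are assumed for illustration, only their sum matters.

```python
# Illustrative ratings for the 14 influence factors (each 0..5; sum = 50 here).
scores = [4, 4, 3, 4, 3, 4, 4, 4, 3, 4, 3, 3, 4, 3]
assert len(scores) == 14 and all(0 <= s <= 5 for s in scores)

inf = 0.65 + sum(scores) / 100
print(inf)   # 1.15, the influence multiplier used in the worked count above
```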
FP metrics
Some advantages
• Based on specification (black-box)
• Technology independent
• Strong relationship to actual effort
• Encourages good development
Some disadvantages
• Needs extensive training
• Subjective
Jones’ “rules of thumb” estimates
[Source: C. Jones “Estimating Software Costs” 1998]
• Code volumes:
Approx. 100 LOC/FP, varies widely
• Schedule:
#calendar months = FP^0.4
• Development staffing:
#persons = FP/150 (average)
Rayleigh curve (staffing profile over time)
• Development effort:
#months * #persons = FP^1.4/150
McConnell
• Equation 8-1 “Software schedule equation”
#months = 3.0 * #man-months^(1/3)
• Table 8-9 “Efficient schedules”
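As a hedged illustration (not from the slides), the sketch below applies these rules of thumb to an assumed project size of 350 function points.

```python
# Jones' rules of thumb for an assumed 350-FP project.
fp = 350

loc      = fp * 100          # ~35,000 LOC (very rough; the 100 LOC/FP ratio varies widely)
schedule = fp ** 0.4         # ≈ 10.4 calendar months
staff    = fp / 150          # ≈ 2.3 persons on average
effort   = fp ** 1.4 / 150   # ≈ 24.3 person-months (= schedule * staff)

# McConnell's schedule equation goes the other way, from effort to schedule:
months = 3.0 * effort ** (1 / 3)   # ≈ 8.7 calendar months
```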
Quality estimation
Typical tradeoff: product (scope) vs. cost (effort) vs. schedule (time)
Adding a dimension: quality
• Early quality will actually reduce costs and time
• Late quality is traded against the other parameters
Quality estimation
Quality measures:
• Fault potential: # of defects introduced during development
• Defect rate: #defects in product
[Source: C. Jones “Estimating Software Costs” 1998]
• Test case volumes:
#test cases = FP^1.2
• Fault potential:
#faults = FP^1.25
• Testing fault removal: 30%/type of testing
85…99% total
• Inspection fault removal:
60..65%/inspection type
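A minimal sketch of how these rules combine; the project size and the exact removal percentages (taken from the low end of the quoted ranges) are assumptions.

```python
# Quality rules of thumb for an assumed 350-FP project.
fp = 350

test_cases      = fp ** 1.2    # ≈ 1129 test cases
fault_potential = fp ** 1.25   # ≈ 1514 faults introduced during development

# Each testing or inspection step removes a fraction of the faults still present.
remaining = fault_potential
for step, removal in [("unit test", 0.30), ("integration test", 0.30),
                      ("system test", 0.30), ("design/code inspection", 0.60)]:
    remaining *= 1 - removal
    print(f"after {step}: about {remaining:.0f} faults remain")
```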
Other typical estimates
[Source: C. Jones “Estimating Software Costs” 1998]
• Maintenance staffing:
#persons = FP/750
• Post-release repair:
rate = 8 faults repaired per person-month
• Software plans and docs:
Page count = FP^1.15
• Creeping requirements:
Rate = 2%/month
0% … 5% / month, depending on method
• Costs per requirement:
$500/FP for initial requirements
$1200/FP for requirements added close to completion
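As a quick worked illustration (project size and schedule assumed): a 350-FP project with a 10-month schedule would need a maintenance staff of roughly 350/750 ≈ 0.5 persons, produce about 350^1.15 ≈ 843 pages of plans and documents, and, with 2% creep per month compounded over 10 months, grow to roughly 350 × 1.02^10 ≈ 427 FP of scope.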
Sample question
Consider a software project of 350 function points, assuming: the ratio of
calendar time vs. development time (development speed) is 2; testing
consists of unit, integration, and system testing; and new requirements
are added at a rate of 3% per month.
(a) Using the estimation rules of thumb discussed in class, give an
estimate for each of the following project parameters, assuming a
waterfall process.
(i) The total effort, expressed in person-months.
(ii) The total cost of the project.
(iii) The number of inspection steps required to obtain fewer than 175
defects.
(b) Re-do the estimates in part (a) assuming that the project can be
split into two nearly independent parts of 200 function points and 150
function points, respectively.
Lifecycle statistics
Life cycle of a project item
• Well represented by a state machine
• E.g. a “bug” life cycle
Simplest form: 3 states
May reach 10s of states when bug prioritization is involved
• Statistics on bugs, requirements, “issues”, tasks, etc.
[State diagram: states Open, Fixed, Closed; transitions performed by DEV and QA]
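A minimal sketch of such a state machine; the transition names and owner labels are assumptions based on the three-state diagram above.

```python
# Three-state bug life cycle; lifecycle statistics are then counts over item states.
TRANSITIONS = {
    ("Open",  "fix"):    "Fixed",    # DEV submits a fix
    ("Fixed", "verify"): "Closed",   # QA confirms the fix
    ("Fixed", "reject"): "Open",     # QA reopens the bug
}

def advance(state, event):
    """Return the next state, or raise KeyError for an illegal transition."""
    return TRANSITIONS[(state, event)]

bugs = {"BUG-1": "Open", "BUG-2": "Fixed"}   # hypothetical tracked items
open_count = sum(1 for s in bugs.values() if s == "Open")
```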
Estimation process
[Diagram contrasting the perceived and the actual estimation process]
Example procedure
What do you think of the following procedure?
[Source: Schneider, Winters, “Applying Use Cases”, Addison-Wesley, 1999]
Starting point: use cases.
UUCP: unadjusted use case points
• Each use case weighted 5, 10, or 15, based on its number of analysis classes
TCF: technical complexity factor
• 0.6 + sum(0.01 * TFactor)
• TFactor sum range: 14
EF: experience factor
• 1.4 + sum(-0.03 * EFactor)
• Efactor sum range: 4.5
UCP: use case points
• UUCP * TCF * EF
PH: person-hours
• UCP * (20..28) + 120
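A sketch of the arithmetic; all counts and factor sums below are assumed for illustration, not part of the slides.

```python
# Use-case-point estimate with assumed inputs.
weights   = {"simple": 5, "average": 10, "complex": 15}   # per use case
use_cases = {"simple": 4, "average": 6, "complex": 2}     # assumed counts

uucp = sum(weights[k] * n for k, n in use_cases.items())  # 4*5 + 6*10 + 2*15 = 110

tfactor = 30                     # weighted sum of the technical factors (assumed)
efactor = 15                     # weighted sum of the experience factors (assumed)
tcf = 0.6 + 0.01 * tfactor       # ≈ 0.90
ef  = 1.4 - 0.03 * efactor       # ≈ 0.95
ucp = uucp * tcf * ef            # ≈ 94 use case points

person_hours = ucp * 20 + 120    # low end of the 20..28 h/UCP range; ≈ 2000 person-hours
```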
Estimation tips
Adapted from [McConnell].
Avoid tentative estimates.
• Allow time for the estimation activity.
Use baselined data.
Use developer-based estimates.
• Estimate by walkthrough.
Estimate by categories.
Estimate at a low level of detail.
Use estimation tools.
Use several different estimation techniques.
• Change estimation practices during a project.
