
Project Estimation Techniques
The seeds of major software disasters are usually sown in the first three months of a software project. Hasty scheduling, irrational commitments, unprofessional estimating techniques, and carelessness of the project management function are the factors that tend to introduce terminal problems. Once a project blindly lurches forward toward an impossible delivery date, the rest of the disaster will occur almost inevitably.
Estimation techniques are:
i. Guesstimating
ii. Delphi
iii. Top-down estimating
iv. Bottom-up estimating
GUESSTIMATING TECHNIQUE

 Estimation by guessing or just picking numbers out of the air is not the best way to derive a project's schedule and budget.
 Unfortunately, many inexperienced project managers tend to guesstimate, or guess at the estimates, because it is quick and easy. For example, we might guesstimate that testing will take two weeks. Why two weeks? Why not three weeks? Or ten weeks?
Underestimating can result in long hours, reduced quality, and unmet client
expectations. If you ever find yourself being pressured to guesstimate, your first
impulse should be to stall until you have enough information to make a confident
estimate.
Delphi Technique
 The Delphi technique involves multiple experts who arrive at a consensus on a particular subject or issue. Although the Delphi technique is generally used for group decision-making, it can be a useful tool for estimating when the time and money warrant the extra effort.
 Using the Delphi technique can take longer and cost more than most estimation methods, but it can be very effective and provide reasonable assurance when the stakes are high and the margin for error is low.
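A minimal sketch of how the rounds of a Delphi estimate might be aggregated, purely for illustration: the four initial estimates and the rule that experts move halfway toward the group median are hypothetical stand-ins for real, anonymous expert revision.

```python
import statistics

def delphi_rounds(estimates, max_rounds=5, tolerance=0.10):
    """Aggregate expert effort estimates over anonymous feedback rounds."""
    for round_no in range(1, max_rounds + 1):
        median = statistics.median(estimates)
        spread = (max(estimates) - min(estimates)) / median
        print(f"Round {round_no}: median = {median:.1f}, spread = {spread:.0%}")
        if spread <= tolerance:          # the experts have converged
            break
        # Hypothetical stand-in for expert revision: each expert moves
        # halfway toward the group median before the next round.
        estimates = [e + 0.5 * (median - e) for e in estimates]
    return statistics.median(estimates)

# Four experts' initial estimates of testing effort, in person-weeks.
print("Consensus estimate:", delphi_rounds([2, 3, 6, 10]))
```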
Time Boxing
 Time boxing is a technique whereby a box of time is allocated for a specific activity or task. This allocation is based more on a requirement than on just guesswork. For example, a project team may have two (and only two) weeks to build a prototype. At the end of the two weeks, work on the prototype stops, regardless of whether the prototype is 100 percent complete.
 Used effectively, time boxing can help focus the project team's effort on an important and critical task. The schedule pressure to meet a particular deadline, however, may result in long hours and pressure to succeed. Used inappropriately or too often, the project team members may become burned out and demoralized.
TOP-DOWN ESTIMATE
 Top-down estimating involves estimating the schedule and/or cost of the entire project in terms of how long it should take or how much it should cost.
 Top-down estimating is a very common occurrence that often results from a mandate made by upper management (e.g., "Thou shalt complete the project within six months and spend no more than $500,000!").
 Often the schedule and/or cost estimate is a product of some strategic plan, or because someone thinks it should take a certain amount of time or cost a particular amount. On the other hand, top-down estimating could be a reaction to the business environment.
Bottom-Up Estimating
 Bottom-up estimating involves dividing the project into smaller modules and then directly estimating the time and effort in terms of person-hours, person-weeks, or person-months for each module.
 The work breakdown structure provides the basis for bottom-up estimating because all of the project phases and activities are defined.
 The project manager, or the project team, can provide reasonable time estimates for each activity. In short, bottom-up estimating starts with a list of all required tasks or activities, and then an estimate for the amount of effort is made for each one, as the sketch below illustrates.
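A minimal sketch of this sum-over-the-WBS idea; the activities and the per-activity effort numbers are hypothetical, purely for illustration.

```python
# Bottom-up estimating: each WBS activity gets a direct effort estimate,
# and the project estimate is simply the sum over all activities.
wbs_estimates = {                 # activity -> estimated effort (person-days)
    "Define requirements": 6,
    "Design database": 5,
    "Build user interface": 8,
    "Write and run tests": 4,
}

total_effort = sum(wbs_estimates.values())
for activity, days in wbs_estimates.items():
    print(f"{activity:22s} {days:3d} person-days")
print(f"{'Project total':22s} {total_effort:3d} person-days")
```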
Software Engineering Metric Approach
 Software engineering focuses on the processes, tools, and methods for developing quality software (Pressman 2001).
 Metrics, on the other hand, provide the basis for software engineering and refer to a broad range of measurements for objectively evaluating computer software.
 The greatest challenge in estimating an IT project is estimating the time and effort for the largest deliverable of the project: the application system.
Lines of Code (LOC)
 Counting the number of lines of code in computer programs is the most traditional and widely used software metric for sizing the application product. It is also the most controversial.
 Although counting lines of code seems intuitively obvious—a 1,000 LOC Java program will be ten times larger than a 100 LOC Java program—counting LOC is not all that straightforward.
 First, what counts as LOC? Do we include comments? Maybe we should not, because a programmer could artificially boost his or her productivity by writing one hundred comment lines for every line of code that actually did something.
 On the other hand, comments are important because they tell us what the code should be doing. This makes it easier to debug and easier for others to understand what the sections of code in the program are supposed to do.
Function Points
 The inherent problems of LOC as a metric for estimation and productivity created the need for a better software metric.
 Function points are a synthetic metric, similar to ones used every day, such as hours, kilos, tons, nautical miles, degrees Celsius, and so on.
 However, function points focus on the functionality and complexity of an application system or a particular module.
 The good thing about function points is that they are independent of the technology.
Benefits of function point analysis (FPA):
i. FPA measures the size of the solution instead of the size of the problem.
ii. It helps in estimating overall project costs, schedule, and effort.
iii. It is independent of technology/programming language.
iv. It helps in the estimation of testing projects.
Function Point Analysis Basics
 External Inputs (EI): The process of capturing inputs from users, such as control information or business information, and storing them in internal/external logical files.
 External Outputs (EO): The process of sending out data to external users or systems. The data might be taken directly from database files or might undergo some system-level processing.
 External Inquiries (EQ): This process includes both input and output components. The data is processed to extract relevant information from internal/external database files.
 Internal Logical Files (ILF): The set of data held within the system. The majority of the data is interrelated and is captured via the inputs received from external sources.
 External Interface Files (EIF): The set of data from external resources or external applications. The majority of the data from the external resource is used by the system for reference purposes.
Function Point Analysis
 Developed by Allan Albrecht, IBM Research, 1979
 Technique to determine size of software projects
 Size is measured from a functional point of view
 Estimates are based on functional requirements
 Albrecht originally used the technique to predict effort
 Size is usually the primary driver of development effort
 Independent of
 Implementation language and technology
 Development methodology
 Capability of the project team
 A top-down approach based on function types
 Three steps: Plan the count, perform the count, estimate the effort.
Steps in Function Point Analysis
 Plan the count
 Type of count: development, enhancement, application
 Identify the counting boundary
 Identify sources for counting information: software, documentation
and/or expert
 Perform the count
 Count data access functions
 Count transaction functions
 Estimate the effort
 Compute the unadjusted function points (UFP)
 Compute the Value Adjustment Factor (VAF)
 Compute the adjusted Function Points (AFP)
 Compute the performance factor
 Calculate the effort in person days
Function Types
Data function types
 # of internal logical files (ILF)
 # of external interface files (EIF)
Transaction function types
 # of external inputs (EI)
 # of external outputs (EO)
 # of external queries (EQ)
Calculate the UFP (unadjusted function points):
UFP = a · EI + b · EO + c · EQ + d · ILF + e · EIF
where a–e are weight factors
Mapping Functions to Transaction Types
External Inputs: Add Customer, Change Customer, Delete Customer, Receive Payment, Deposit Item, Retrieve Item, Add Place, Change Place Data, Delete Place
External Outputs: Print Customer Item List, Print Bill, Print Item List
External Inquiries: Query Customer, Query Customer's Items, Query Places, Query Stored Items
The data functions represent the functionality provided to the user by attending to their internal and external requirements in relation to the data, whereas the transactional functions describe the functionality provided to the user in relation to the processing of this data by the application.
Evaluation of Value Adjusted FP
The Value Adjustment Factor (VAF) is derived from the sum of the degrees of influence (DI) of the 14 general system characteristics (GSCs). The general system characteristics are:
1. Data communications
2. Distributed data processing
3. Performance
4. Heavily utilised configuration
5. Transaction rate
6. On-line data entry
7. End-user efficiency
8. On-line update
9. Complex processing
10. Reusability
11. Installation ease
12. Operational ease
13. Multiple sites/organisations
14. Facilitate change
 Function points can be converted to effort in person-hours or person-days using a performance factor: the number of function points that can be completed per unit of time (see the effort calculation below).
Calculate the Unadjusted Function Points

                                      Weight Factors
Function Type               Number    Simple  Average  Complex
External Input (EI)         ___   x      3       4        6     = ___
External Output (EO)        ___   x      4       5        7     = ___
External Queries (EQ)       ___   x      3       4        6     = ___
Internal Datasets (ILF)     ___   x      7      10       15     = ___
Interfaces (EIF)            ___   x      5       7       10     = ___
Unadjusted Function Points (UFP)                                = ___

Each count is multiplied by the weight for its complexity level, and the products are summed to give the UFP.
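A minimal sketch of filling in this worksheet, assuming every function is of average complexity. The EI/EO/EQ counts come from the mapping example above (9 inputs, 3 outputs, 4 inquiries); the single ILF and EIF are hypothetical, added just to exercise all five rows.

```python
# Average-complexity weights from the worksheet above.
AVERAGE_WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

# Counts: EI/EO/EQ from the mapping example; ILF/EIF are hypothetical.
counts = {"EI": 9, "EO": 3, "EQ": 4, "ILF": 1, "EIF": 1}

ufp = 0
for ftype, n in counts.items():
    row_total = n * AVERAGE_WEIGHTS[ftype]
    ufp += row_total
    print(f"{ftype:4s} {n:2d} x {AVERAGE_WEIGHTS[ftype]:2d} = {row_total:3d}")
print(f"Unadjusted Function Points (UFP) = {ufp}")  # 36+15+16+10+7 = 84
```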
14 General System Complexity Factors
The unadjusted function points are adjusted with general system complexity (GSC) factors:
GSC1: Reliable backup & recovery
GSC2: Use of data communication
GSC3: Use of distributed computing
GSC4: Performance
GSC5: Realization in heavily used configuration
GSC6: On-line data entry
GSC7: User friendliness
GSC8: On-line data change
GSC9: Complex user interface
GSC10: Complex procedures
GSC11: Reuse
GSC12: Ease of installation
GSC13: Use at multiple sites
GSC14: Adaptability and flexibility
• Each of the GSC factors gets a value from 0 to 5.
Calculate the Effort
 After the GSC factors are determined, compute the Value Adjustment Factor (VAF):

VAF = 0.65 + 0.01 * Σ (i = 1..14) GSC_i, where each GSC_i = 0, 1, ..., 5

 Function Points = Unadjusted Function Points * Value Adjustment Factor
 FP = UFP · VAF
 Performance factor
 PF = Number of function points that can be completed per day
 Effort = FP / PF
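A minimal sketch of the whole chain above, reproducing the first worked example below (UFP = 18, GSC sum = 22, PF = 2); the particular distribution of the 14 GSC ratings is hypothetical, since only their sum matters here.

```python
def fpa_effort(ufp, gsc_values, pf):
    """Adjusted function points and effort in person-days."""
    assert len(gsc_values) == 14 and all(0 <= g <= 5 for g in gsc_values)
    vaf = 0.65 + 0.01 * sum(gsc_values)   # Value Adjustment Factor
    fp = ufp * vaf                        # adjusted function points
    return fp, fp / pf                    # Effort = FP / PF

# Hypothetical ratings for the 14 GSCs, chosen to sum to 22.
gsc = [2, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, 1, 1]
fp, effort = fpa_effort(ufp=18, gsc_values=gsc, pf=2)
print(f"FP = {fp:.2f}, effort = {effort:.1f} person-days")  # FP = 15.66, ~8 days
```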
Examples

Example 1:
 UFP = 18
 Sum of GSC factors = 22
 VAF = 0.65 + 0.22 = 0.87
 Adjusted FP = VAF * UFP = 0.87 * 18 = 15.66, rounded up to 16
 PF = 2
 Effort = 16 / 2 = 8 person days

Example 2:
 UFP = 18
 Sum of GSC factors = 70
 VAF = 0.65 + 0.70 = 1.35
 Adjusted FP = VAF * UFP = 1.35 * 18 = 24.3, rounded up to 25
 PF = 1
 Effort = 25 / 1 = 25 person days
Advantages of Function Point Analysis
 Independent of implementation language and technology
 Estimates are based on design specification
 Usually known before implementation tasks are known
 Users without technical knowledge can be integrated into the estimation process
 Incorporation of experiences from different organizations
 Easy to learn
 Limited time effort
Disadvantages of Function Point Analysis
 Complete description of functions necessary
 Often not the case in early project stages -> especially in iterative
software processes
 Only complexity of specification is estimated
 Implementation is often more relevant for estimation
 High uncertainty in calculating function points:
 Weight factors are usually derived from past experience (the environment, technology, and tools used may be out-of-date in the current project)
 Does not measure the performance of people
COCOMO (COnstructive COst MOdel)
 Developed by Barry Boehm in 1981
 Also called COCOMO I or Basic COCOMO
 Top-down approach to estimate cost, effort and schedule of software
projects, based on size and complexity of projects
 Assumptions:
 Derivability of effort by comparing finished projects (“COCOMO database”)
 System requirements do not change during development
 Exclusion of some efforts (for example administration, training, rollout,
integration).
Calculation of Effort
 Estimate number of instructions
 KDSI = "Kilo Delivered Source Instructions"
 Determine project complexity parameters: A, B
 Regression analysis, matching project data to equation
 3 levels of difficulty that characterize projects
 Simple project ("organic mode")
 Semi-complex project ("semidetached mode")
 Complex project ("embedded mode")
 Calculate effort
 Effort = A * KDSI^B
 Also called Basic COCOMO
Calculation of Effort in Basic COCOMO
Formula: Effort = A * KDSI^B
 Effort is counted in person months: 152 productive hours (8 hours per day, 19 days/month, less weekends, holidays, etc.)
 A, B are constants based on the complexity of the project

Project Complexity     A      B
Simple                2.4   1.05
Semi-Complex          3.0   1.12
Complex               3.6   1.20
Calculation of Development Time
Basic formula: T = C * Effort^D
 T = time to develop in months
 C, D = constants based on the complexity of the project
 Effort = effort in person months (see previous slide)

Project Complexity     C      D
Simple                2.5   0.38
Semi-Complex          2.5   0.35
Complex               2.5   0.32
Basic COCOMO Example
Volume = 30,000 LOC = 30 KLOC
Project type = Simple
Effort = 2.4 * (30)^1.05 = 85 PM
Development Time = 2.5 * (85)^0.38 = 13.5 months
=> Avg. staffing: 85 / 13.5 = 6.3 persons
=> Avg. productivity: 30,000 / 85 = 353 LOC/PM

Compare:
Semi-detached: 135 PM, 13.9 months, 9.7 persons
Embedded: 213 PM, 13.9 months, 15.3 persons
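A minimal sketch reproducing the example above for all three project modes; the constants are the A, B, C, D values from the preceding slides.

```python
MODES = {  # mode: (A, B, C, D) from the Basic COCOMO tables above
    "Simple (organic)": (2.4, 1.05, 2.5, 0.38),
    "Semi-detached": (3.0, 1.12, 2.5, 0.35),
    "Embedded": (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kdsi, mode):
    """Effort (person-months), development time (months), avg. staffing."""
    a, b, c, d = MODES[mode]
    effort = a * kdsi ** b
    time = c * effort ** d
    return effort, time, effort / time

for mode in MODES:
    pm, months, staff = basic_cocomo(30, mode)
    print(f"{mode:17s} {pm:4.0f} PM  {months:4.1f} months  {staff:4.1f} persons")
# Simple: 85 PM, 13.5 months, 6.3 persons -- matches the slide.
```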
Other COCOMO Models

 Intermediate COCOMO
 15 cost drivers yielding a multiplicative correction factor
 Basic COCOMO is based on value of 1.00 for each of the cost drivers
 Detailed COCOMO
 Multipliers depend on phase: Requirements; System Design; Detailed
Design; Code and Unit Test; Integrate & Test; Maintenance
Steps in Intermediate COCOMO
 Basic COCOMO steps:
 Estimate number of instructions
 Determine project complexity parameters: A, B
 Determine level of difficulty that characterizes the project
 New step:
 Determine cost drivers
 15 cost drivers c1, c2, …, c15
 Calculate effort
 Effort = A * KDSI^B * c1 * c2 * … * c15
Calculation of Effort in Intermediate COCOMO
Basic formula: Effort = A * KDSI^B * c1 * c2 * … * c15
Effort is measured in PM (person months: 152 productive hours, i.e., 8 hours per day, 19 days/month, less weekends, holidays, etc.)
A, B are constants based on the complexity of the project

Project Complexity     A      B
Simple                2.4   1.05
Semi-Complex          3.0   1.12
Complex               3.6   1.20
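A minimal sketch of the intermediate model: Basic COCOMO effort multiplied by the product of the 15 cost drivers (listed on the next slide). The three non-nominal multiplier values below are hypothetical example ratings, just to show the correction factor at work.

```python
import math

A, B = 3.0, 1.12                 # semi-complex (semidetached) project

cost_drivers = [1.00] * 15       # nominal rating (1.00) for every driver
cost_drivers[0] = 1.15           # hypothetical: required reliability rated high
cost_drivers[7] = 0.86           # hypothetical: analyst capability rated high
cost_drivers[11] = 0.91          # hypothetical: use of software tools rated high

eaf = math.prod(cost_drivers)    # multiplicative correction factor
effort = A * 30 ** B * eaf       # 30 KDSI, in person-months
print(f"Correction factor = {eaf:.2f}, effort = {effort:.0f} PM")
# Nominal effort would be 135 PM; capable staff and tools cut it to ~122 PM.
```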
Intermediate COCOMO: 15 Cost Drivers
 Product Attributes
 Required reliability
 Database size
 Product complexity
 Computer Attributes
 Execution time constraint
 Main storage constraint
 Virtual storage volatility
 Turnaround time
 Personnel Attributes
 Analyst capability
 Applications experience
 Programmer capability
 Virtual machine experience
 Language experience
 Project Attributes
 Use of modern programming practices
 Use of software tools
 Required development schedule
• Each driver is rated on a qualitative scale between "very low" and "extra high".
• The associated values are multiplied with each other.
COCOMO II
 Revision of COCOMO I in 1997
 Provides three models of increasing detail
 Application Composition Model
 Estimates for prototypes based on GUI builder tools and existing
components
 Early Design Model
 Estimates before software architecture is defined
 For system design phase, closest to original COCOMO, uses function
points as size estimation
 Post Architecture Model
 Estimates once architecture is defined
 For actual development phase and maintenance; Uses FPs or SLOC as size
measure
 Estimator selects one of the three models based on current state
of the project.
