
Chapter 7: Project Economics - Concept of Project Management (2018)

Project management is the discipline of planning, organizing, motivating, and controlling resources to
achieve specific goals. A project is a temporary endeavor with a defined beginning and end undertaken to meet
unique goals and objectives, typically to bring about beneficial change or added value.
The primary challenge of project management is to achieve all of the project goals and objectives while
honoring the preconceived constraints. The primary constraints are scope, time, quality, and budget. The
secondary, and more ambitious, challenge is to optimize the allocation of necessary inputs and integrate them to
meet the pre-defined objectives.
Two forefathers of project management are Henry Gantt, called the father of planning and control
techniques, who is famous for his use of the Gantt chart as a project management tool, and Henri Fayol, known for
his creation of the five management functions that form the foundation of the body of knowledge associated with
project and program management.
In 1956, the American Association of Cost Engineers (now AACE International, the Association for the
Advancement of Cost Engineering) was formed by early practitioners of project management and the associated
specialties of planning and scheduling, cost estimating, and cost/schedule control (project control).
The International Project Management Association (IPMA) was founded in Europe in 1967, as a federation of
several national project management associations.

Project Costing Based on Metrics


What Is a Metric?
A metric, by definition, is any type of measurement used to gauge some quantifiable component of
performance. A metric can be directly collected through observation, such as number of days late, or number of
software defects found; or the metric can be derived from directly observable quantities, such as defects per
thousand lines of code, or a cost performance index (CPI). When used in a monitoring system to assess project or
program health, a metric is called an indicator, or a key performance indicator (KPI).
A performance metric is a measure of an organization's activities and performance. Performance metrics
should support a range of stakeholder needs, from customers and shareholders to employees. While traditionally
many metrics are finance-based, focusing inwardly on the performance of the organization, metrics may also
focus on performance against customer requirements and value. In project management, performance metrics
are used to assess the health of the project and consist of the measuring of seven criteria: safety, time, cost,
resources, scope, quality, and actions.
Developing performance metrics usually follows a process of:
1. Establishing critical processes/customer requirements
2. Identifying specific, quantifiable outputs of work
3. Establishing targets against which results can be scored
Project metrics can be categorized into three main categories:
1. Pure project management measurements (Example: Estimation accuracy)
2. Indicators of project success (Example: Stakeholder satisfaction)
3. Indicators of business success (Example: ROI).
The following criteria are the most common tactical measures people want to be updated about:

Tactical Measure | Question Answered                                      | Sample Indicator
Time             | How are we doing against the schedule?                 | Schedule Performance Index (SPI) = Earned Value ÷ Planned Value
Cost             | How are we doing against the budget?                   | Cost Performance Index (CPI) = Earned Value ÷ Actual Cost
Resources        | Are we within anticipated limits of staff-hours spent? | Amount of hours overspent per software iteration
Scope            | Have the scope changes been more than expected?        | Number of Change Requests
Quality          | Are the quality problems being fixed?                  | Number of defects fixed per user acceptance test
Action Items     | Are we keeping up with our action item list?           | Number of action items behind schedule for resolution
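As a quick illustration of the two index formulas in the table, here is a minimal Python sketch; the earned-value figures are hypothetical, purely for demonstration.

def spi(earned_value, planned_value):
    """Schedule Performance Index: EV / PV (below 1.0 means behind schedule)."""
    return earned_value / planned_value

def cpi(earned_value, actual_cost):
    """Cost Performance Index: EV / AC (below 1.0 means over budget)."""
    return earned_value / actual_cost

# Hypothetical mid-project figures: we planned to earn 50,000 of value by now,
# have actually earned 40,000, and have spent 48,000 so far.
ev, pv, ac = 40_000.0, 50_000.0, 48_000.0
print(f"SPI = {spi(ev, pv):.2f}")  # 0.80 -> behind schedule
print(f"CPI = {cpi(ev, ac):.2f}")  # 0.83 -> over budget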

A metric is an inevitable part of any piece of work being performed: it is the system in place to measure the
quality, or rather the performance, of the work delivered. Work that is not controlled and measured can prove
equivalent to incorrect work being delivered. Technology grows at such a tremendous pace that enterprises
constantly strive to keep well-defined project metrics in place.
Project metrics can be stated as pre-defined or identified measures or benchmarks that the deliverable
is supposed to attain in order to deliver the expected value. With clearly defined project metrics, business groups
are able to assess the success of a project. Though certain unstated measures, such as whether the project was
delivered on time and within budget, have existed ever since the advent of enterprises, the need for more
analytics in this area has spiked. There are different types of project metric analysis systems in place across the
industry, such as costing-based, resource-based, and hours-based systems. Let me take you through some common
project metrics that relate to the person-hours delivered in a project.
Effort Variance (Ev): a derived metric which alerts you that the project needs to be brought back under
control. Let there be a project A with the following current attributes:

Planned effort: 100 hrs
Actual effort so far: 150 hrs
Project progress: 50%

Therefore, if 150 hrs were taken at 50% progress, then X hrs will be taken at 100%:

X = (100 * 150) / 50 = 300 hrs

where X is the predicted total effort within which the project is going to complete. Hence the variance is

Ev = ((Actual - Planned) / Planned) * 100

with the predicted total X standing in for the actual effort at completion:

Ev = ((300 - 100) / 100) * 100 = 200%

The predicted variance indicates that the project requires attention; otherwise it will complete at a much higher
cost in terms of the effort delivered.
Schedule Variance (Sv): The schedule variance uses the same calculation, with the number of days considered
instead of hours.
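The extrapolation and variance arithmetic above is simple enough to script. Below is a minimal Python sketch of the same calculation; the schedule-variance figures at the end are hypothetical, since the text gives no worked example in days.

def predicted_total_effort(actual_so_far, percent_complete):
    """Extrapolate the total at 100% from what has been spent at the current progress."""
    return actual_so_far * 100.0 / percent_complete

def variance_percent(predicted, planned):
    """Variance of the predicted total over the planned figure, as a percentage."""
    return (predicted - planned) / planned * 100.0

# The worked example above: 150 hrs spent at 50% progress against 100 planned hrs.
x = predicted_total_effort(actual_so_far=150.0, percent_complete=50.0)  # 300 hrs
print(f"Ev = {variance_percent(x, planned=100.0):.0f}%")                # 200%

# Schedule Variance: same arithmetic with days (hypothetical figures).
d = predicted_total_effort(actual_so_far=30.0, percent_complete=50.0)   # 60 days
print(f"Sv = {variance_percent(d, planned=40.0):.0f}%")                 # 50%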

Weighted Defect Rate (WDR): a defect metric calculated from the weights assigned to reported bugs. The
weight depends on two factors: severity and reporter.

Weights by severity, in descending order: Block, Crash, Major, Minor

Weights by reporter, in descending order: Client, SQC, Team

The rate is calculated against the total planned hours for the project.
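The text fixes only the ordering of the weights, not their numeric values or how the two factors combine, so the Python sketch below assumes illustrative integer weights and multiplies the two factors; treat both choices as assumptions.

# Hypothetical weight tables: only the ordering is given in the text.
SEVERITY_WEIGHT = {"Block": 4, "Crash": 3, "Major": 2, "Minor": 1}
REPORTER_WEIGHT = {"Client": 3, "SQC": 2, "Team": 1}

def weighted_defect_rate(bugs, planned_hours):
    """WDR: total bug weight (severity weight x reporter weight) per planned hour."""
    total = sum(SEVERITY_WEIGHT[sev] * REPORTER_WEIGHT[rep] for sev, rep in bugs)
    return total / planned_hours

bugs = [("Block", "Client"), ("Major", "SQC"), ("Minor", "Team")]  # illustrative
print(f"WDR = {weighted_defect_rate(bugs, planned_hours=100.0):.2f}")  # 0.17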

Quality Costs:
Cost of Quality (CoQ): the total time spent on review activities in the project, for example requirements review,
design review, code review, test plan review, team meetings for clarifications, client calls, etc.

CoQ = (Total Review hrs / Total Project Planned hrs) * 100

Cost of Detection: the total time spent on testing activity.

Cost of Detection = (Total Testing hrs / Total Project Planned hrs) * 100

Cost of Failure (Cost of Poor Quality, CoPQ): the total time spent on rework in the project. Rework includes
bug fixing, design changes, test plan changes, etc.

Cost of Failure = (Total Rework hrs / Total Project Planned hrs) * 100
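All three quality costs share the same shape: hours in a category over total planned hours. A minimal sketch, with hypothetical hour counts:

def quality_cost_percent(category_hours, total_planned_hours):
    """Hours spent in a quality category as a percentage of total planned hours."""
    return category_hours / total_planned_hours * 100.0

total_planned = 1000.0  # hypothetical project size in hours
review_hrs, testing_hrs, rework_hrs = 80.0, 150.0, 60.0

print(f"CoQ               = {quality_cost_percent(review_hrs, total_planned):.1f}%")   # 8.0%
print(f"Cost of Detection = {quality_cost_percent(testing_hrs, total_planned):.1f}%")  # 15.0%
print(f"CoPQ              = {quality_cost_percent(rework_hrs, total_planned):.1f}%")   # 6.0%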

Estimation Techniques

Estimation of the resources, cost, and schedule of software development is very important.
Making a good estimate requires experience and expertise to convert qualitative measures into quantitative form.
Factors such as project size and the amount of risk involved affect the accuracy and efficacy of estimates.

The following are the different techniques for estimation:


- Decomposition Technique
- Empirical Estimation Models
- Automated Estimation Tools

Decomposition Technique
Here we subdivide the problem into smaller problems; when all the small problems are solved, the main problem is
solved. Two size-based measures are used:
Lines of Code
Function Point
LOC (Lines of Code) and FP (Function Point) estimation methods consider size as the measure. In LOC, the cost is
calculated based on the number of lines of code; in FP, the cost is calculated based on the number of various
functions in the program.
Empirical Estimation Models

Estimation models use empirically derived formulas to predict the estimates. Here we conduct a study of some
completed projects, and from those observations we form statistical formulas which we can use to estimate the
cost of other projects. The structure of an empirical estimation model is a formula, derived from data collected
from past software projects, that uses software size to estimate effort. Size itself is an estimate,
described as either lines of code (LOC) or function points (FP).
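As a sketch of how such a formula can be derived, the Python snippet below fits a power-law model effort = a * size^b to past-project data by linear regression in log-log space; the data points are purely illustrative, not from any real study.

import numpy as np

# Hypothetical past-project data: size in KLOC, effort in person-months.
sizes = np.array([10.0, 25.0, 40.0, 80.0, 120.0])
efforts = np.array([26.0, 70.0, 115.0, 245.0, 380.0])

# Fit effort = a * size**b by ordinary least squares in log-log space.
b, log_a = np.polyfit(np.log(sizes), np.log(efforts), deg=1)
a = np.exp(log_a)
print(f"fitted model: effort ~= {a:.2f} * size^{b:.2f}")

# Apply the fitted formula to estimate a new 60-KLOC project.
print(f"estimated effort for 60 KLOC: {a * 60.0 ** b:.0f} person-months")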

Constructive Cost Model (COCOMO)


Model 1: Basic COCOMO
Model 2: Intermediate COCOMO
Model 3: Advanced COCOMO
The basic COCOMO model computes software development effort (and cost) as a function of program size
expressed in estimated lines of code. The intermediate COCOMO model computes software development effort as
a function of program size and a set of cost drivers that include hardware and personnel attributes. The advanced
COCOMO model incorporates all characteristics of the intermediate version, with an assessment of the cost drivers'
impact on each step (analysis, design, coding, etc.) of the software engineering process.

Automated Estimation Tools


The decomposition techniques and the empirical estimation models can be implemented in software.
These automated tools allow the planner to estimate cost and effort, and also provide important information
such as the delivery date and staffing.
Algorithmic Method
The algorithmic method is designed to provide mathematical equations with which to perform software
estimation. These mathematical equations are based on research and historical data, and use inputs such as Source
Lines of Code (SLOC), the number of functions to perform, and other cost drivers such as language, design
methodology, skill levels, risk assessments, etc. Algorithmic methods have been studied extensively, and many
models have been developed, such as the COCOMO models, the Putnam model, and function point based models.
Advantages:
It is able to generate repeatable estimations.
It is easy to modify input data, refine and customize formulas.
It is efficient and able to support a family of estimations or a sensitivity analysis.
It is objectively calibrated to previous experience.
Disadvantages:
It is unable to deal with exceptional conditions, such as exceptional personnel, exceptional teamwork, or an
exceptional match between skill levels and tasks.
Poor sizing inputs and inaccurate cost driver ratings will result in inaccurate estimation.
Some experience and factors cannot be easily quantified.

COCOMO Models
One very widely used algorithmic software cost model is the Constructive Cost Model (COCOMO).
The basic COCOMO model has a very simple form:

MAN-MONTHS = K1 * (Thousands of Delivered Source Instructions)^K2

where K1 and K2 are two parameters dependent on the application and development environment.
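A minimal sketch of the basic model in Python, using the commonly cited 1981 coefficient values for the three COCOMO project classes (organic, semi-detached, embedded) as K1 and K2:

# Commonly cited Basic COCOMO coefficient values (Boehm, 1981) per project
# class; K1 is the multiplier and K2 the exponent from the formula above.
COEFFICIENTS = {
    "organic": (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded": (3.6, 1.20),
}

def basic_cocomo_effort(kdsi, mode="organic"):
    """Effort in man-months = K1 * KDSI ** K2."""
    k1, k2 = COEFFICIENTS[mode]
    return k1 * kdsi ** k2

print(f"{basic_cocomo_effort(32.0):.0f} man-months")  # ~91 for a 32-KDSI organic project

Note that K2 is greater than 1 in every class: effort grows faster than linearly with size, reflecting the diseconomies of scale the model assumes.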
Estimates from the basic COCOMO model can be made more accurate by taking into account other factors
concerning the required characteristics of the software to be developed, the qualification and experience of the
development team, and the software development environment. Some of these factors are:
Complexity of the software
Required reliability
Size of database
Required efficiency (memory and execution time)
Analyst and programmer capability
Experience of team in the application area
Experience of team with the programming language and computer
Use of tools and software engineering practices

Many of these factors affect the person months required by an order of magnitude or more. COCOMO assumes that
the system and software requirements have already been defined, and that these requirements are stable. This is
often not the case.
The COCOMO model is a regression model, based on the analysis of 63 selected projects. The primary input is
KDSI (thousands of delivered source instructions). The problems are:
In the early phases of the system life cycle, the size is estimated with great uncertainty, so an accurate cost
estimate cannot be arrived at.
The cost estimation equation is derived from the analysis of 63 selected projects, and usually has problems
outside of its particular environment. For this reason, recalibration is necessary.
The first version of the COCOMO model was originally developed in 1981. It has since experienced increasing
difficulties in estimating the cost of software developed under new life cycle processes and capabilities, including
rapid-development process models, reuse-driven approaches, object-oriented approaches, and the software process
maturity initiative.
For these reasons, the newest version, COCOMO 2.0, was developed. The major new modeling capabilities
of COCOMO 2.0 are a tailorable family of software size models, involving object points, function points and source
lines of code; nonlinear models for software reuse and reengineering; an exponent-driver approach for modeling
relative software diseconomies of scale; and several additions, deletions, and updates to previous COCOMO effort-
multiplier cost drivers. This new model is also serving as a framework for an extensive current data collection and
analysis effort to further refine and calibrate the model's estimation capabilities.

Function Point Analysis Based Methods


The two algorithmic models above require estimators to predict the number of SLOC in order to get man-month
and duration estimates. Function Point Analysis is another method of quantifying the size and complexity of a
software system, in terms of the functions that the system delivers to the user. A number of proprietary models
for cost estimation have adopted a function point type of approach, such as ESTIMACS and SPQR/20.
The function point measurement method was developed by Allan Albrecht at IBM and published in 1979. He
believed function points offer several significant advantages over SLOC counts as a measure of size. There are two
steps in counting function points:
1. Counting the user functions. The raw function counts are arrived at by considering a linear combination of five
basic software components: external inputs, external outputs, external inquiries, logical internal files, and external
interfaces, each at one of three complexity levels: simple, average, or complex. The sum of these numbers,
weighted according to the complexity level, is the number of function counts (FC).
2. Adjusting for environmental processing complexity. The final function point count is arrived at by multiplying
FC by an adjustment factor that is determined by considering 14 aspects of processing complexity. This adjustment
factor allows the FC to be modified by at most +35% or -35%.
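A Python sketch of the two counting steps, using the commonly published Albrecht/IFPUG component weights (the text does not list them, so treat the exact weight table as an assumption) and the standard 0.65 + 0.01 * sum-of-ratings adjustment, which yields exactly the ±35% range described above:

# Commonly published Albrecht/IFPUG weights per component and complexity
# level; the exact values are an assumption, since the text omits them.
WEIGHTS = {
    "external_inputs": {"simple": 3, "average": 4, "complex": 6},
    "external_outputs": {"simple": 4, "average": 5, "complex": 7},
    "external_inquiries": {"simple": 3, "average": 4, "complex": 6},
    "logical_internal_files": {"simple": 7, "average": 10, "complex": 15},
    "external_interfaces": {"simple": 5, "average": 7, "complex": 10},
}

def function_points(counts, gsc_ratings):
    """Step 1: weighted function counts (FC). Step 2: multiply by
    0.65 + 0.01 * sum of the 14 complexity ratings (0-5 each), i.e. FC +/- 35%."""
    fc = sum(WEIGHTS[component][level] * n
             for component, levels in counts.items()
             for level, n in levels.items())
    return fc * (0.65 + 0.01 * sum(gsc_ratings))

counts = {  # illustrative counts for a small system
    "external_inputs": {"simple": 4, "complex": 1},
    "external_outputs": {"average": 3},
    "external_inquiries": {"simple": 2},
    "logical_internal_files": {"average": 2},
    "external_interfaces": {"simple": 1},
}
print(f"FP = {function_points(counts, gsc_ratings=[3] * 14):.1f}")  # 68.5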
The collection of function point data has two primary motivations. One is the desire by managers to monitor levels
of productivity. Another use of it is in the estimation of software development cost.

There are some cost estimation methods which are based on a function point type of measurement, such as
ESTIMACS and SPQR/20. SPQR/20 is based on a modified function point method. Whereas traditional function
point analysis is based on evaluating 14 factors, SPQR/20 separates complexity into three categories: complexity
of algorithms, complexity of code, and complexity of data structures. ESTIMACS is a proprietary system designed
to give a development cost estimate at the conception stage of a project, and it contains a module which estimates
function points as a primary input for estimating cost.
The advantages of function point analysis based models are:
Function points can be estimated from requirements specifications or design specifications, thus making it
possible to estimate development cost in the early phases of development.
Function points are independent of the language, tools, or methodologies used for implementation.
Non-technical users have a better understanding of what function points are measuring, since function points are
based on the system user's external view of the system.
LOC

Lines of code (often referred to as Source Lines of Code, SLOC or LOC) is a software metric used to measure the
amount of code in a software program. LOC is typically used to estimate the amount of effort that will be required
to develop a program, as well as to estimate productivity once the software is produced. Measuring software size
by the number of lines of code has been in practice since the inception of software.
There are two major types of LOC measures: physical LOC and logical LOC. The most common definition of
physical LOC is a count of "non-blank, non-comment lines" in the text of the program's source code. Logical LOC
measures attempt to measure the number of "statements", but their specific definitions are tied to specific
computer languages (one simple logical LOC measure for C-like languages is the number of statement-terminating
semicolons). It is much easier to create tools that measure physical LOC, and physical LOC definitions are easier to
explain. However, physical LOC measures are sensitive to logically irrelevant formatting and style conventions,
while logical LOC is less sensitive to formatting and style conventions. Unfortunately, LOC measures are often
stated without giving their definition, and logical LOC can often be significantly different from physical LOC.
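Both definitions are easy to approximate in code. The Python sketch below counts physical LOC as non-blank lines that are not full-line comments, and logical LOC for C-like languages by counting statement-terminating semicolons; both counters are deliberately crude (semicolons inside string literals or for-loop headers, for example, would be miscounted).

def physical_loc(source):
    """Physical LOC: non-blank lines that are not full-line comments."""
    return sum(1 for line in source.splitlines()
               if line.strip() and not line.strip().startswith(("//", "#")))

def logical_loc_c_like(source):
    """Crude logical LOC for C-like languages: statement-terminating semicolons."""
    return source.count(";")

c_snippet = """
// add two numbers
int add(int a, int b) {
    int sum = a + b;
    return sum;
}
"""
print(physical_loc(c_snippet))        # 4 physical lines
print(logical_loc_c_like(c_snippet))  # 2 logical statements

The two measures diverge exactly as the text warns: reformatting the snippet onto fewer or more lines changes the physical count but leaves the logical count untouched.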
There are several cost, schedule, and effort estimation models which use LOC as an input parameter, including the
widely used Constructive Cost Model (COCOMO) series of models invented by Dr. Barry Boehm. While these
models have shown good predictive power, they are only as good as the estimates (particularly the LOC estimates)
fed to them.


