
2014 AACE® INTERNATIONAL TECHNICAL PAPER

PS.1734

Evaluation of Baseline Schedule Metrics for Successful Project Schedule Performance
Jin Ouk Choi and Anthony J. Gonzales
Abstract—The development of a reasonable baseline schedule is a challenge for some construction professionals. Some construction projects experience schedule impacts or delays as a result of fatally flawed baseline schedules that produce an unreasonable forecasted completion. Accordingly, some organizations, public entities, private firms, software companies, and consultants have developed checklists or guidelines for evaluating the mechanics of baseline schedules to improve the likelihood of forecasting a reasonable completion date. However, these guidelines have yet to be substantiated. The purpose of this paper is to identify and evaluate a list of industry-recognized metrics to determine whether a schedule is fatally flawed or suggests a reasonable forecast. Accordingly, this paper addresses the correlation between current Industry Metrics and the development of a reasonable schedule. The findings are presented after evaluating 27 current Industry Metrics against eight completed case study projects within the commercial and university building sectors. The case study projects are divided into two groups based on the amount of delay. The analysis results show significant differences in baseline schedule quality between the two groups.


Table of Contents

Abstract
List of Figures
List of Tables
Introduction
Methodology
Analysis Results and Findings
Future Research Needs
Conclusions
References

List of Figures

Figure 1 – Research Methodology

List of Tables

Table 1 – Selected Quantitative Industry Metrics
Table 2 – Schedule Quality Analysis Result (Group A)
Table 3 – Schedule Quality Analysis Result (Group B)


Introduction

Schedules can be a great tool for tracking and forecasting the execution plan, completion date, progress, resources, activity durations, sequence, and costs of a construction project. In order for a schedule to forecast a reasonable plan and completion date, it must contain the contractor's scope of work, be properly updated and maintained, and have proper schedule mechanics (mechanics refers to the composition of a schedule network).

Although construction scheduling is a topic that has been well developed over several decades,
many construction projects experience schedule impacts or delays as a result of fatally flawed
baseline schedules that produce an unreasonable forecasted completion. Generally, a baseline
schedule is considered fatally flawed when its mechanics or insufficient/incomplete scope of
work prevent the scheduler from forecasting a reasonable date for completion. Schedulers
often rely on the project contract, plans, specifications, personal/project experiences and
education to develop the mechanics and scope of work to forecast a reasonable completion
date within a baseline schedule. To improve the likelihood of forecasting a reasonable
completion date, some organizations, public entities, private firms, software companies and
consultants have developed checklists or guidelines for evaluating the mechanics of baseline
schedules. These guidelines, though, have yet to be substantiated. The purpose of this paper, then, is to identify and evaluate a list of industry-recognized metrics to determine whether a schedule is fatally flawed or offers a reasonable forecast.

Methodology

The following methodology was used to determine whether current industry-recognized metrics could indicate that a schedule is fatally flawed or depicts a reasonable forecast. The methodology followed a seven-step process: 1) Literature Review; 2) Literature Analysis; 3) Metric Compilation; 4) Metric Selection; 5) Case Study Project Collection; 6) Case Study Project Screening and Classification; and 7) Industry Metric Analysis. Figure 1 depicts this seven-step process.


Figure 1 – Research Methodology

1. Literature Review
Some readers may point out that metrics to check the quality of a schedule already exist. Accordingly, an extensive interview process and literature review were performed. Industry professionals were interviewed to identify which schedule metrics, checklists, guidelines, best practices, and publications currently exist and to understand the impact these metrics have on a schedule. Professors from the Department of Civil, Architectural and Environmental Engineering at The University of Texas at Austin, project managers, forensic schedule consultants, and owners' representatives were interviewed. In addition, sources such as the U.S. Department of Veterans Affairs (VA) and the U.K. Royal Institution of Chartered Surveyors (RICS) were researched to find standards or recommended practices related to schedule metrics.

2. Literature Analysis
Recommended literature was obtained, compiled, and reviewed to identify referenced schedule metrics. Information obtained from the literature included, but was not limited to, the following:

• focus/purpose of the literature
• intended audience of the literature
• contents
• methodology of the literature
• findings of the literature
• specified quantitative metrics


The referenced information was imported into a table for comparative purposes. Rows and/or
columns were added when a new metric was found or literature was added. This facilitated the
documentation of the metrics relative to the information source.
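A minimal sketch of such a comparison matrix, using hypothetical (metric, source) pairs rather than the actual review data, might look as follows; pandas is assumed purely for illustration:

```python
# A sketch (not the authors' actual worksheet) of the comparison table
# described above: one row per schedule metric, one column per literature
# source, True where the source references the metric.
import pandas as pd

# Hypothetical excerpt of (metric, source) references gathered during review.
references = [
    ("High Duration", "DCMA"), ("High Duration", "GAO"),
    ("High Float", "DCMA"), ("Hard Constraints", "PMI"),
]

df = pd.DataFrame(references, columns=["metric", "source"])
# Pivot into a metric-by-source matrix; new rows/columns appear
# automatically as new metrics or sources are added.
matrix = pd.crosstab(df["metric"], df["source"]).astype(bool)
print(matrix)
```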

3. Metric Compilation
The referenced schedule metrics from each publication were compiled, organized, and analyzed. Redundant metrics were evaluated and removed from the compilation. The remaining items formed a 'master list' of possible schedule metrics to consider. In all, over 100 quantitative schedule metrics were documented, compiled, and analyzed.

4. Metric Selection
Compiled schedule metrics were categorized and selected. Schedule metrics that appeared in multiple publications/sources and seemed reasonable for evaluating a baseline schedule were identified, selected, and labeled "Industry Metrics." Analyzing over 100 quantitative metrics would have been difficult and time consuming, since no research exists regarding the impact of each metric. Accordingly, Industry Metrics were evaluated against the recommended thresholds identified within the literature. Industry Metrics were considered for exclusion in one or more of the following scenarios:

1. The metric did not have a recommended threshold in the literature researched.
2. The metric was not appropriate for analyzing a baseline schedule. For example, earned value, progress, and benchmarking metrics were not considered in the evaluation, as they typically apply to the evaluation of schedule updates.

It is reasonable for some of the metrics not to have recommended thresholds. Examples include the total number of tasks (activities), number of relationships, number of milestones, average duration, and so forth. These metrics serve as the basis for calculating other metrics. However, they were not included in the Industry Metrics because they are intended to help understand a schedule rather than measure its quality.

On the other hand, many metrics had no recommended threshold even though one arguably should exist (e.g., the number of highest concurrent tasks, number of redundant links, path convergence/divergence, number of soft constraints, average total float, etc.). After proper scientific study to identify thresholds, these metrics may be appropriate for measuring schedule quality. They, too, were excluded, as no recommended threshold existed within the researched literature.

5. Case Study Project Collection


Case study projects were collected from multiple sources, and project information was obtained to analyze their characteristics. A questionnaire was developed and transmitted to several industry professionals to obtain project baseline schedules and related information. The information consisted of the following:

• Approved Baseline Schedule
• Contract Value
• Description of Project (Project Type, Location, Owner, Contractor, etc.)
• Forecasted (from Baseline) and Actual Project Duration
• Forecasted (from Baseline) and Actual Project Start Date and Year
• Forecasted (from Baseline) and Actual Project Completion Date and Year
• Reported Project Delays & Delay Categories (Force Majeure, Weather Delays, Labor Strikes, Owner Delay, Scope Change/Change Order, etc.)

Afterward, the case study projects were sorted by their schedule performance and renamed by
project number.

6. Case Study Project Screening and Classification


The received project information was reviewed, organized, and classified. Projects with insufficient information were excluded from the analysis. In total, eight case study projects were classified into two groups according to the following criteria:

1. Average percentage of delay relative to planned project duration; and
2. The distribution of the average percentage of delay, which was checked for inflection points and extreme variation. It is assumed that each schedule should have a normal distribution of scores (delay amounts) and equal or similar variances.

Ultimately, the threshold for dividing the case study projects into the two groups was whether a project's average schedule delay was more or less than 10%. Case study projects experiencing less than 10% schedule delay were categorized as Group A; these projects had an average of 6% schedule delay. Case study projects experiencing greater than 10% schedule delay were categorized as Group B; these projects had an average of 50% schedule delay.
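For illustration, the classification rule can be expressed as a short sketch; the planned and actual durations below are hypothetical examples, not the case study data.

```python
# A minimal sketch of the Group A / Group B split described above.
# Planned/actual durations are hypothetical, not the case study data.
def percent_delay(planned_days: int, actual_days: int) -> float:
    """Schedule delay as a percentage of the planned project duration."""
    return (actual_days - planned_days) / planned_days * 100.0

projects = {
    "Example project X": (400, 416),   # 4% delay -> Group A
    "Example project Y": (300, 450),   # 50% delay -> Group B
}

for name, (planned, actual) in projects.items():
    delay = percent_delay(planned, actual)
    group = "A" if delay < 10.0 else "B"
    print(f"{name}: {delay:.0f}% delay -> Group {group}")
```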

7. Industry Metric Analysis


Industry Metrics were analyzed against the case study projects through the following process (a simplified sketch of the screening in steps 7 through 9 follows this list):

1. Collected baseline or initial-stage schedules were selected for export.
2. Data types were identified and selected for export (activities, activity relationships, resources, and resource assignments).
3. A template was created to export the data.
4. Metric values were calculated by the calculation methods addressed in Table 1.
5. Schedule data was exported and compiled into spreadsheet format (XLS).
6. Most of the metric analysis values were calculated easily with Excel features such as pivot tables, functions (SUM, AVERAGE, LOOKUP, COUNTIF, IF, etc.), and the Find feature.
7. Each returned metric analysis value was reviewed and compared to the recommended Industry Metric thresholds identified in the literature.
8. Metric analysis values that were within the Industry Metric thresholds were identified and labeled "Pass." Similarly, metric analysis values that were NOT within the Industry Metric thresholds were identified and labeled "Fail."
9. The numbers of "Pass" and "Fail" metrics were tabulated and analyzed.
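As an illustration of steps 7 through 9, the following minimal sketch labels computed metric values "Pass" or "Fail" against their thresholds and tallies the results. The metric subset, thresholds, comparison directions, and values shown are illustrative assumptions only; the authors performed this screening in Excel, and Python is used here as an equivalent substitute.

```python
# Simplified sketch of the pass/fail screening in steps 7-9. Thresholds and
# comparison directions are illustrative; see Table 1 for published values.
from collections import Counter

# (metric name, threshold %, "max" if values at or below the threshold pass,
#  "min" if values at or above it pass) -- illustrative subset only.
THRESHOLDS = [
    ("High Duration", 5.0, "max"),
    ("High Float", 5.0, "max"),
    ("Relationship Type (Finish to Start)", 90.0, "min"),
]

def screen(metric_values: dict[str, float]) -> Counter:
    """Label each computed metric value PASS or FAIL and tally the results."""
    tally = Counter()
    for name, threshold, direction in THRESHOLDS:
        value = metric_values[name]
        ok = value <= threshold if direction == "max" else value >= threshold
        tally["PASS" if ok else "FAIL"] += 1
        print(f"{name}: {value:.2f}% -> {'PASS' if ok else 'FAIL'}")
    return tally

# Hypothetical values for one project, as computed from the exported data.
print(screen({"High Duration": 2.02, "High Float": 36.99,
              "Relationship Type (Finish to Start)": 95.06}))
```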

Analysis Results and Findings

In total, 27 quantitative Industry Metrics were selected along with the published recommended
thresholds. Table 1 provides a list of the 27 quantitative Industry Metrics with the
corresponding literature referenced, a brief description of the metric, and how to calculate or
identify the metric within a schedule.

Each entry below gives the metric number and name, the referenced literature ([number] – organization), a brief description of the metric, and how to identify or calculate the metric within a schedule.

1. Activity ID. References: [6] GAO; [10] Naval Air; [14] PMI; [15] UT OFPC. Every activity should have a unique identification number. Identify/calculate: percentage of activities with a unique Activity ID.

2. Activity Name (Unique). References: [6] GAO; [10] Naval Air; [11] NDIA. Every activity should have a unique name. Identify/calculate: percentage of activities with a unique Activity Name.

3. Activity Codes / WBS / Reference Code. References: [10] Naval Air; [14] PMI. Every activity should have an activity code, including a WBS by location, floor, phase, etc. Identify/calculate: percentage of activities with an activity code.

4. Responsibility / Organizational / Functional Directory. References: [10] Naval Air; [15] UT OFPC. A schedule should have a responsibility, organizational, or functional directory. Identify/calculate: find the responsibility / organizational / functional directory.

5. Responsibility / Organizational / Functional Codes. References: [10] Naval Air; [15] UT OFPC. Every activity should be assigned a responsibility, organizational, or functional code. Identify/calculate: percentage of activities with a responsibility / organizational / functional code.

6. Ratio of Detail Activities to Milestones. References: [6] GAO. A rough indicator of the level of planning detail. Identify/calculate: divide the number of detail activities by the number of milestones.

7. Milestones Missing Predecessor or Successor. References: [11] NDIA. Every milestone should have at least one predecessor and one successor. Identify/calculate: percentage of milestones missing a predecessor or successor.

8. Milestones with Resources. References: [6] GAO; [11] NDIA; [14] PMI. Milestones must have no resources. Identify/calculate: percentage of milestones with a resource assigned.

9. Milestones with Duration. References: [2] DCMA; [6] GAO; [9] NASA; [11] NDIA; [14] PMI. Milestones must have no duration. Identify/calculate: percentage of milestones with a duration assigned.

10. Start and Finish Milestones. References: [6] GAO; [11] NDIA; [14] PMI. A project start milestone and a project finish milestone should be present in the schedule. Identify/calculate: find the project start milestone and the project finish milestone.

11. High Duration. References: [2] DCMA; [4] DOD; [6] GAO; [10] Naval Air; [11] NDIA; [14] PMI. An activity longer than 44 working days (2 months) needs attention. Identify/calculate: percentage of activities longer than 44 working days.

12. Extreme Duration. References: [10] Naval Air; [11] NDIA. An activity longer than 120 working days (125 per Naval Air) needs close attention. Identify/calculate: percentage of activities longer than 120 working days.

13. Project Calendar. References: [6] GAO; [10] Naval Air; [11] NDIA; [14] PMI. At the project level, the project calendar must constitute the primary or default calendar for the project. Identify/calculate: find the project calendar; is a project calendar assigned?

14. Holidays. References: [6] GAO; [10] Naval Air; [11] NDIA; [15] UT OFPC. Holidays and other exceptions are assigned in the calendar. Identify/calculate: are there holidays or other exceptions?

15. Basic Relationship (Open Ends). References: [2] DCMA; [4] DOD; [5] DOD; [6] GAO; [10] Naval Air; [11] NDIA; [14] PMI; [15] UT OFPC. Every activity should have a predecessor and a successor. Identify/calculate: percentage of activities missing a predecessor or successor.

16. Relationship Type (Finish to Start). References: [2] DCMA; [4] DOD; [6] GAO; [10] Naval Air; [11] NDIA; [14] PMI. Finish-to-start relationships should be the majority of relationships. Identify/calculate: percentage of finish-to-start relationships.

17. Critical Path Test (Horizontal Traceability). References: [2] DCMA; [4] DOD; [5] DOD; [6] GAO; [10] Naval Air; [11] NDIA; [14] PMI; [15] UT OFPC. A schedule should react when activities' durations are increased by improbable amounts (500 or 1,000 days). Identify/calculate: increase several activities' durations by 500 or 1,000 days.

18. Activities on Critical Path. References: [4] DOD. A schedule should not be overly simplified; an adequate number of activities should be on the critical path. Identify/calculate: percentage of activities on the critical path.

19. Link in Summary Activity / Hammock / Level of Effort. References: [6] GAO; [9] NASA; [10] Naval Air; [11] NDIA. Summary activities/hammocks should not have relationships. Identify/calculate: percentage of summary activities/hammocks with a predecessor or successor.

20. Hard Constraints. References: [2] DCMA; [4] DOD; [6] GAO; [10] Naval Air; [11] NDIA; [14] PMI; [15] UT OFPC. Hard constraints (must start on, must finish on, finish no later than, start no later than) should be used carefully. Identify/calculate: percentage of hard constraints.

21. Constraints %. References: [5] DOD; [9] NASA. A significant number of constraints in the schedule is an indicator of schedule quality. Identify/calculate: percentage of constraints.

22. Resources Rate/Prices Assigned. References: [2] DCMA; [6] GAO; [9] NASA; [10] Naval Air; [11] NDIA; [14] PMI. All activities with durations greater than zero should have dollars or hours assigned. Identify/calculate: percentage of activities with resources assigned.

23. Resource Library/Dictionary. References: [14] PMI. A resource library or dictionary should be organized into some meaningful structure. Identify/calculate: find a resource library or dictionary.

24. High Float. References: [2] DCMA; [4] DOD; [6] GAO; [10] Naval Air; [11] NDIA. An activity with total float greater than 44 working days needs attention. Identify/calculate: percentage of activities with total float greater than 44 working days.

25. Extreme Float. References: [11] NDIA. An activity with total float greater than 120 working days needs close attention. Identify/calculate: percentage of activities with total float greater than 120 working days.

26. Lags. References: [2] DCMA; [4] DOD; [6] GAO; [10] Naval Air; [14] PMI. Lags should not be used, or used only rarely. Identify/calculate: percentage of lags in predecessor logic relationships.

27. Leads. References: [2] DCMA; [4] DOD; [6] GAO; [10] Naval Air; [14] PMI. Leads should not be used, or used only rarely. Identify/calculate: percentage of leads in predecessor logic relationships.

Table 1 – Selected Quantitative Industry Metrics
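To make the "identify/calculate" column concrete, the following minimal sketch computes four representative Table 1 metrics (Nos. 11, 12, 16, and 24) from a toy schedule export. The column names and data are assumptions, and the authors' actual analysis used Excel pivot tables and functions; pandas is an equivalent substitute here.

```python
# Sketch of computing a few Table 1 metrics from exported schedule data.
# Column names and the toy data are assumptions for illustration only.
import pandas as pd

activities = pd.DataFrame({
    "id": ["A100", "A110", "A120", "M010"],
    "duration_days": [10, 50, 130, 0],        # working days
    "total_float_days": [5, 60, 10, 0],
    "is_milestone": [False, False, False, True],
})
relationships = pd.DataFrame({"type": ["FS", "FS", "SS", "FS"]})

detail = activities[~activities["is_milestone"]]
n = len(detail)

# Metric 11 / 12: share of activities over the 44- and 120-day thresholds.
high_duration = (detail["duration_days"] > 44).sum() / n * 100
extreme_duration = (detail["duration_days"] > 120).sum() / n * 100
# Metric 16: finish-to-start share of all relationships.
fs_share = (relationships["type"] == "FS").mean() * 100
# Metric 24: share of activities with total float over 44 working days.
high_float = (detail["total_float_days"] > 44).sum() / n * 100

print(f"High Duration: {high_duration:.1f}% (threshold 5%)")
print(f"Extreme Duration: {extreme_duration:.1f}% (threshold 0%)")
print(f"Finish-to-Start: {fs_share:.1f}% (threshold >= 90%)")
print(f"High Float: {high_float:.1f}% (threshold 5%)")
```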

The Industry Metrics were derived from the reviewed publications and were not tested or substantiated prior to selection. Based on the extensive literature review, the currently identified metrics appear to be not based on substantiated evidence but rather somewhat subjective, rules of thumb, or drawn from the experience of the authoring organizations. The referenced publications did not assess the impact of the metrics on the quality of the schedule mechanics, and no research or data was found to support the publications' recommended metrics. Accordingly, analysts should use caution when relying on these Industry Metrics and the published corresponding thresholds.

Ultimately, projects within the commercial or higher education sector of construction were selected for analysis. The case study projects had the following general characteristics:

• commercial or university project schedule
• medium-size project (estimated duration between approximately 300 and 1,200 calendar days)
• developed by different contractors or schedulers
• developed in Primavera P3 or P6
• schedules developed between 2004 and 2011
• projects completed between 2008 and 2013
• baseline or initial-stage schedules collected


All of the case study projects had delays. It would have been more advantageous if the
comparison could have been made between case projects with and without delays. However,
the amount of delay variance between the two groups was significant enough to identify their
schedule quality differences.

Before the schedule analysis, the eight schedules were divided into two groups by their schedule performance. (Originally, the authors collected 12 schedules; however, four failed to qualify for this analysis due to a lack of background data such as actual duration and the amount of delay caused by change orders, weather, etc.) The threshold for dividing the groups was a 10% schedule delay: Group A projects had an average of 6% schedule delay, and Group B projects had an average of 50%. Projects were sorted by their schedule performance and named accordingly as Projects 1 through 8. The individual "PASS" or "FAIL" results and schedule performance of Projects 1, 2, and 3 (Group A) are presented in Table 2. The "PASS" or "FAIL" results and schedule performance of Projects 4, 5, 6, 7, and 8 (Group B) are presented in Table 3.

| No | Industry Metric | Threshold | Project 1 | Project 2 | Project 3 |
|----|-----------------|-----------|-----------|-----------|-----------|
| | Performance (% delay & rank) | | 1.04% (1) | 1.06% (2) | 1.08% (3) |
| 1 | Activity ID | 100.00% | 100.00% PASS | 100.00% PASS | 100.00% PASS |
| 2 | Activity Name (Unique) | 100.00% | 95.10% FAIL | 100.00% PASS | 96.23% FAIL |
| 3 | Activity Codes/WBS/Reference Code | 100.00% | 100.00% PASS | 100.00% PASS | 100.00% PASS |
| 4 | Responsibility/Organizational/Functional Directory | Y | N FAIL | N FAIL | N FAIL |
| 5 | Responsibility/Organizational/Functional Codes | 100.00% | 81.21% FAIL | 94.56% FAIL | 79.72% FAIL |
| 6 | Ratio of Detail Activities to Milestones | Low<=2, 10<=High | 5.070175 PASS | 17.375 FAIL | 3.97619 PASS |
| 7 | Milestones Missing Predecessor or Successor | 95.00% | 98.21% PASS | 71.43% FAIL | 97.62% PASS |
| 8 | Milestones with Resources | 0.00% | 0.00% PASS | 0.00% PASS | 0.00% PASS |
| 9 | Milestones with Duration | 0.00% | 0.00% PASS | 0.00% PASS | 0.00% PASS |
| 10 | Start and Finish Milestones | 2 | 1 FAIL | 1 FAIL | 2 PASS |
| 11 | High Duration | 5.00% | 2.02% PASS | 4.08% PASS | 3.32% PASS |
| 12 | Extreme Duration | 0.00% | 0.29% FAIL | 0.00% PASS | 0.95% FAIL |
| 13 | Project Calendar | Y | Y PASS | Y PASS | Y PASS |
| 14 | Holidays | Y | Y PASS | Y PASS | Y PASS |
| 15 | Basic Relationship (Open Ends) | 5.00% | 0.00% PASS | 0.00% PASS | 0.00% PASS |
| 16 | Relationship Type (Finish to Start) | 90.00% | 95.06% PASS | 93.62% PASS | 98.04% PASS |
| 17 | Critical Path Test (Horizontal Traceability) | Y | Y PASS | Y PASS | Y PASS |
| 18 | Activities on Critical Path | 15~20% | 8.48% FAIL | 19.23% PASS | 24.88% FAIL |
| 19 | Link in Summary Activity/Hammock/Level of Effort | 3.00% | 0.29% PASS | 0.00% PASS | 0.95% PASS |
| 20 | Hard Constraints | 5.00% | 0.00% PASS | 0.00% PASS | 0.00% PASS |
| 21 | Constraints % | 10%, 15% | 0.00% PASS | 0.00% PASS | 0.00% PASS |
| 22 | Resources Rate/Prices Assigned | 100.00% | 0.00% FAIL | 0.00% FAIL | 0.00% FAIL |
| 23 | Resource Library/Dictionary | Y | Y PASS | Y PASS | Y PASS |
| 24 | High Float | 5.00% | 36.99% FAIL | 9.52% FAIL | 16.10% FAIL |
| 25 | Extreme Float | 0.00% | 4.91% FAIL | 2.04% FAIL | 0.00% PASS |
| 26 | Lags | 5.00% | 4.94% PASS | 7.23% FAIL | 3.59% PASS |
| 27 | Lead | 0.00% | 0.00% PASS | 0.00% PASS | 0.00% PASS |
| | Total PASS | | 18 | 18 | 20 |
| | Total FAIL | | 9 | 9 | 7 |

Table 2 – Schedule Quality Analysis Result (Group A)

| No | Industry Metric | Threshold | Project 4 | Project 5 | Project 6 | Project 7 | Project 8 |
|----|-----------------|-----------|-----------|-----------|-----------|-----------|-----------|
| | Performance (% delay & rank) | | 1.16% (4) | 1.21% (5) | 1.23% (6) | 1.93% (7) | 1.99% (8) |
| 1 | Activity ID | 100.00% | 100.00% PASS | 100.00% PASS | 100.00% PASS | 100.00% PASS | 100.00% PASS |
| 2 | Activity Name (Unique) | 100.00% | 100.00% PASS | 100.00% PASS | 95.74% FAIL | 76.62% FAIL | 53.64% FAIL |
| 3 | Activity Codes/WBS/Reference Code | 100.00% | 0.00% FAIL | 100.00% PASS | 100.00% PASS | 92.51% FAIL | 100.00% PASS |
| 4 | Responsibility/Organizational/Functional Directory | Y | N FAIL | N FAIL | N FAIL | N FAIL | Y PASS |
| 5 | Responsibility/Organizational/Functional Codes | 100.00% | 0.00% FAIL | 0.00% FAIL | 0.00% FAIL | 0.00% FAIL | 57.82% FAIL |
| 6 | Ratio of Detail Activities to Milestones | Low<=2, 10<=High | 40.967742 FAIL | 10.26 FAIL | * FAIL | 101.33333 FAIL | 4.55 PASS |
| 7 | Milestones Missing Predecessor or Successor | 95.00% | 80.65% FAIL | 90.24% FAIL | * FAIL | 66.66% FAIL | 98.30% PASS |
| 8 | Milestones with Resources | 0.00% | * FAIL | 0.00% PASS | * FAIL | * FAIL | * FAIL |
| 9 | Milestones with Duration | 0.00% | 0.00% PASS | 0.00% PASS | * FAIL | 0.00% PASS | 0.00% PASS |
| 10 | Start and Finish Milestones | 2 | 1 FAIL | 2 PASS | 2 PASS | 2 PASS | 2 PASS |
| 11 | High Duration | 5.00% | 4.53% PASS | 2.00% PASS | 1.43% PASS | 2.93% PASS | 0.36% PASS |
| 12 | Extreme Duration | 0.00% | 0.61% FAIL | 1.08% FAIL | 0.00% PASS | 0.00% PASS | 0.00% PASS |
| 13 | Project Calendar | Y | Y PASS | Y PASS | Y PASS | Y PASS | Y PASS |
| 14 | Holidays | Y | Y PASS | Y PASS | Y PASS | Y PASS | Y PASS |
| 15 | Basic Relationship (Open Ends) | 5.00% | 0.71% PASS | 4.01% PASS | 16.43% FAIL | 3.29% PASS | 6.00% FAIL |
| 16 | Relationship Type (Finish to Start) | 90.00% | 92.81% PASS | 94.68% PASS | 85.09% FAIL | 91.11% PASS | 79.59% FAIL |
| 17 | Critical Path Test (Horizontal Traceability) | Y | Y PASS | N FAIL | Y PASS | Y PASS | Y PASS |
| 18 | Activities on Critical Path | 15~20% | 32.87% FAIL | 4.03% FAIL | 45.00% FAIL | 5.88% FAIL | 0.00% FAIL |
| 19 | Link in Summary Activity/Hammock/Level of Effort | 3.00% | 0.08% PASS | 1.60% PASS | 0.00% PASS | 0.00% PASS | 0.00% PASS |
| 20 | Hard Constraints | 5.00% | 0.00% PASS | 0.03% PASS | 0.00% PASS | 0.00% PASS | 0.18% PASS |
| 21 | Constraints % | 10%, 15% | 0.46% PASS | 1.25% PASS | 0.71% PASS | 0.98% PASS | 2.00% PASS |
| 22 | Resources Rate/Prices Assigned | 100.00% | 0.00% FAIL | 0.00% FAIL | 0.00% FAIL | 0.00% FAIL | 0.00% FAIL |
| 23 | Resource Library/Dictionary | Y | N FAIL | N FAIL | N FAIL | N FAIL | 0.00% FAIL |
| 24 | High Float | 5.00% | 31.16% FAIL | 10.83% FAIL | 14.29% FAIL | 33.88% FAIL | 21.17% FAIL |
| 25 | Extreme Float | 0.00% | 11.13% FAIL | 4.56% FAIL | 11.43% FAIL | 0.00% PASS | 7.66% FAIL |
| 26 | Lags | 5.00% | 7.03% FAIL | 5.88% FAIL | 0.00% PASS | 5.11% FAIL | 15.11% FAIL |
| 27 | Lead | 0.00% | 0.08% FAIL | 0.03% FAIL | 0.00% PASS | 2.44% FAIL | 12.90% FAIL |
| | Total PASS | | 12 | 14 | 13 | 14 | 15 |
| | Total FAIL | | 15 | 13 | 14 | 13 | 12 |

Table 3 – Schedule Quality Analysis Result (Group B)

The average schedule performance (% delay) of Group A was 6%; the average schedule performance of Group B was 50%. On average, among all 27 measurements, Group A passed 18.7 thresholds (69.2%), while Group B passed 13.6 (50.4%). Unfortunately, it is difficult to claim statistical significance due to the small number of schedules. However, the authors were able to identify a relationship between baseline schedule quality and the metrics identified in the referenced literature. From the threshold screening analysis by group, the authors identified that Group A (the better schedule performance group) had higher baseline schedule quality than Group B (the poorer schedule performance group).
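As a quick arithmetic check, the group averages above follow directly from the PASS totals in Tables 2 and 3; the small difference between the 69.1% computed here and the reported 69.2% is a rounding artifact.

```python
# Reproduce the group averages reported above from the PASS totals
# in Tables 2 and 3 (27 metrics per project).
group_a_pass = [18, 18, 20]            # Projects 1-3 (Table 2)
group_b_pass = [12, 14, 13, 14, 15]    # Projects 4-8 (Table 3)

for name, passes in [("Group A", group_a_pass), ("Group B", group_b_pass)]:
    avg = sum(passes) / len(passes)
    print(f"{name}: {avg:.1f} of 27 thresholds passed ({avg / 27:.1%})")
```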

As presented in Tables 2 and 3, several Industry Metrics were within or outside the published thresholds for both Group A and Group B. The following Industry Metrics were within the published thresholds (passed) for both groups:

• unique activity ID
• project calendar
• holiday list
• link in summary activity / hammock / level of effort
• number of hard constraints
• number of general constraints

Although these metrics were within the published thresholds for both groups, they may not definitively determine schedule quality. Adjustments to the thresholds may be required to determine whether a metric has an impact on schedule quality.

In addition, the following Industry Metrics were outside the published thresholds for both Group A and Group B:

• responsibility / organizational / functional codes
• activities assigned with resource rate or price
• activities with total float greater than 44 working days

Similarly, the thresholds for the above metrics may need to be adjusted; failure to fall within the threshold for these metrics may not, by itself, determine schedule performance.

From this analysis, the authors were able to develop a framework for reviewing and analyzing schedule-quality metrics. The existence of leads in the baseline schedule, utilization of a resource library/dictionary, the ratio of activities with basic relationships (a predecessor and a successor, i.e., no open ends), and the ratio of relationship types were good indicators of the quality of baseline schedules.

Future Research Needs

Additional research and analysis are recommended. The following considerations are based on the research and analysis performed to date:


1. Additional case study projects are required. A substantial effort is needed to analyze a large number of baseline schedules in order to develop effective quantitative schedule-quality metrics and thresholds.
2. Research is needed on setting the proper threshold for each metric. Based on the literature review, the Industry Metrics appear to be somewhat subjective, rules of thumb, or based on the experience of the authoring organizations; they are not based on substantiated evidence. Furthermore, the referenced publications did not assess the impact of the metrics on the quality of the schedule mechanics, and no research or data was found to support their recommended metrics.

Conclusions

The purpose of this paper was to identify and evaluate a list of industry-recognized metrics to determine whether a schedule is fatally flawed or represents a reasonable forecast. Current Industry Metrics were analyzed, filtered, and compared across the case study projects. While additional projects are needed to improve the statistical sample size, the following observations were noted based on the findings:

1. The case study projects with higher baseline schedule quality (Group A) had less schedule delay. Group A passed approximately 19 percentage points more of the Industry Metric thresholds and experienced 44 percentage points less delay than Group B.
2. Schedules whose milestones had an adequate ratio of detail activities, proper relationship ties, and no resource assignments experienced better schedule performance.
3. The existence of "leads" in the baseline schedule, utilization of a resource library/dictionary, the ratio of activities with basic relationships (a predecessor and a successor, i.e., no open ends), and the ratio of relationship types were good indicators for checking the quality of baseline schedules.
4. All of the case study projects passed Industry Metrics such as unique activity ID, project calendar, holiday list, link in summary activity / hammock / level of effort, number of hard constraints, and number of general constraints. In addition, all of the case study projects failed the metrics for responsibility / organizational / functional codes, activities assigned with resource rates or prices, and activities with total float greater than 44 working days. Metrics that most schedules uniformly passed or failed may not be effective discriminators of schedule quality; to improve a metric's ability to measure schedule quality, adjustments to the recommended thresholds are needed.


References

1. AACE International, 2011, Total Cost Management Framework: An Integrated Approach to Portfolio, Program, and Project Management, First Edition, AACE International.
2. DCMA (Defense Contract Management Agency), 2012, Earned Value Management System (EVMS) Program Analysis Pamphlet (PAP), U.S. Department of Defense.
3. U.S. Department of Defense, 2003, The Program Managers' Guide to the Integrated Baseline Review Process, Office of the Under Secretary of Defense.
4. U.S. Department of Defense, 2005, Integrated Master Plan and Integrated Master Schedule Preparation and Use Guide, U.S. Department of Defense.
5. U.S. Department of Defense, 2012, Over Target Baseline and Over Target Schedule Guide, OUSD AT&L (PARCA).
6. GAO, 2012, GAO Schedule Assessment Guide: Best Practices for Project Schedules, U.S. Government Accountability Office.
7. GAO, 2009, GAO Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs, DIANE Publishing.
8. O'Brien, J.J. and F.L. Plotnick, 2010, CPM in Construction Management, 7th Edition, McGraw-Hill.
9. NASA, 2010, Schedule Management Handbook, National Aeronautics and Space Administration, NASA Center for AeroSpace Information.
10. Naval Air Systems Command Program Success Orientation Team, 2010, Integrated Master Schedule (IMS) Guidebook, NAVAIR.
11. NDIA, 2012, Planning & Scheduling Excellence Guide (PASEG), National Defense Industrial Association.
12. NDIA, 2011, Earned Value Management Systems Intent Guide, National Defense Industrial Association.
13. NDIA, 2005, ANSI/EIA-748-A Standard for Earned Value Management Systems Intent Guide, National Defense Industrial Association.
14. PMI, 2007, Practice Standard for Scheduling, Project Management Institute.
15. University of Texas, Office of Facilities Planning and Construction, 2011, Design and Construction Standard, Division 01 General Requirements, Specification Section 01 32 00 – Project Planning and Scheduling.


Jin Ouk Choi


The University of Texas at Austin
[email protected]

Anthony J. Gonzales
Spire Consulting Group LLC
[email protected]

Copyright © AACE® International.
This paper may not be reproduced or republished without expressed written consent from AACE® International.
