
Automated Weapons Accountability and Tracking System (AWATS)

Team Hoodwick

Team Members
Lina Ciarleglio
Tuyen Dam
Matt Henry
Leonard Woody III

Revision 1.0

George Mason University


Volgenau School of Information Technology and Engineering
Department of Information and Software Systems Engineering

SWE 625 – Software Project Management


Professor Ken Nidiffer

Submission Date: 12/04/2006


Revisions
Stage | Rev | Date | Section Updated | Description / Remarks
1 | 0.1 | 09/12/2006 | Front Matter | Project Scope
1 | 0.1 | 09/16/2006 | 1.1 | Project Overview
1 | 0.1 | 09/16/2006 | 1.1.1 | Project Description
1 | 0.1 | 09/16/2006 | 1.1.2 | Product Summary
1 | 0.1 | 09/16/2006 | 1.1.3 | Project Summary
1 | 0.1 | 09/16/2006 | 1.2 | Project Deliverables
1 | 0.1 | 09/16/2006 | 1.2.1 | Software Application, CSCI
1 | 0.1 | 09/16/2006 | 1.2.2 | Delivery Location & Quantities
1 | 0.1 | 09/16/2006 | 1.2.3 | Documentation
1 | 0.1 | 09/16/2006 | 1.3 | Evolution of the SPMP
2 | 0.2 | 09/25/2006 | 3.1 | Management Objectives
2 | 0.2 | 09/25/2006 | 3.1.1 | Goals & Objectives
2 | 0.2 | 09/25/2006 | 3.1.2 | Management Priorities
2 | 0.2 | 09/25/2006 | 3.2 | Assumptions, Dependencies, and Constraints
2 | 0.2 | 09/25/2006 | 3.2.1 | Assumptions
2 | 0.2 | 09/25/2006 | 3.2.2 | Dependencies
2 | 0.2 | 09/25/2006 | 3.2.3 | Constraints
2 | 0.2 | 09/25/2006 | 5.3 | Resource Estimate
2 | 0.2 | 09/25/2006 | 5.5 | Schedule Estimate
3 | 0.3 | 10/02/2006 | 2.1 | Process Model
3 | 0.3 | 10/02/2006 | 2.1.1 | Process Milestones
3 | 0.3 | 10/02/2006 | 2.1.2 | Baseline
3 | 0.3 | 10/02/2006 | 2.1.3 | Reviews
3 | 0.3 | 10/01/2006 | 2.2 | Organizational Structure
3 | 0.3 | 10/01/2006 | 2.3 | Organizational Boundaries and Interfaces
3 | 0.3 | 10/01/2006 | 2.4 | Project Responsibilities
3 | 0.3 | 10/01/2006 | 2.4.1 | Project Manager
3 | 0.3 | 10/01/2006 | 2.4.2 | Assistant Project Manager
3 | 0.3 | 10/01/2006 | 2.4.3 | Chief Programmer
3 | 0.3 | 10/01/2006 | 2.4.4 | Secretary
3 | 0.3 | 10/01/2006 | 2.4.5 | Systems Engineer/Analyst
3 | 0.3 | 10/01/2006 | 2.4.6 | Requirements Analyst
3 | 0.3 | 10/01/2006 | 2.4.7 | Technical Team Leader
3 | 0.3 | 10/01/2006 | 2.4.8 | Programmer
3 | 0.3 | 10/01/2006 | 2.4.9 | Tester
3 | 0.3 | 10/01/2006 | 2.4.10 | Help Desk Technician
3 | 0.3 | 10/01/2006 | 2.4.11 | RFID Integration Manager
3 | 0.3 | 10/01/2006 | 2.4.12 | RFID Engineer
3 | 0.3 | 10/01/2006 | 2.4.13 | Project Specialists
3 | 0.3 | 10/01/2006 | 1.1 | Project Overview (1.1.1: Added System Architecture Diagram)
4 | 0.4 | 10/08/2006 | 4.1 | Methods, Tools, and Techniques
4 | 0.4 | 10/08/2006 | Appendix B | Enterprise Architecture Framework
4 | 0.4 | 10/08/2006 | Total Costs | Total Costs
4 | 0.4 | 10/08/2006 | Appendix C | Tools Costs
4 | 0.4 | 10/08/2006 | 5.3 | Resource Estimate (5.3: Updated resources)
4 | 0.4 | 10/08/2006 | 2.4 | Project Responsibilities (2.4: Updated Responsibility Matrices)
4 | 0.4 | 10/13/2006 | 2.4 | Project Responsibilities (2.4: Updated Responsibility Matrices)
4 | 0.4 | 10/13/2006 | Appendix D | Documentation Process
4 | 0.4 | 10/13/2006 | Appendix E | Product Assurance TOC
4 | 0.4 | 10/16/2006 | 4.2 | Software Documentation
4 | 0.4 | 10/20/2006 | 3.5 | Staffing Plan
4 | 0.4 | 10/20/2006 | Total Costs | Total Costs (Updated Total Costs)
4 | 0.4 | 10/20/2006 | 4.3.1 | Configuration Management
4 | 0.4 | 10/20/2006 | 4.3.2 | Quality Assurance
4 | 0.4 | 10/20/2006 | 4.3.3 | Verification and Validation
4 | 0.4 | 10/20/2006 | 4.3.4 | Test and Evaluation
4 | 0.4 | 10/20/2006 | 4.3.5 | Information Systems Security
4 | 0.4 | 10/20/2006 | 2.1.1 | Process Milestones (2.1.1: Updated Process Milestones)
5 | 0.5 | 10/25/2006 | 3.4 | Monitoring and Controlling Mechanisms
5 | 0.5 | 10/25/2006 | 3.4.1 | Schedule and Budget
5 | 0.5 | 10/25/2006 | 3.4.2 | Quality Assurance
5 | 0.5 | 10/25/2006 | 3.4.3 | Productivity
6 | 0.6 | 11/13/2006 | 5.0 | Work Packages, Resource Requirements, Budget Allocation, and Schedule
6 | 0.6 | 11/13/2006 | 5.1 | Work Packages
6 | 0.6 | 11/13/2006 | 5.1.1 | Work Package Specifications Example
6 | 0.6 | 11/13/2006 | 5.2 | Dependencies
6 | 0.6 | 11/13/2006 | 5.4 | Budget & Resource Allocation
6 | 0.6 | 11/13/2006 | 1.1 | Project Overview (1.1.1: Added Computer Architecture Diagram)
6 | 0.6 | 11/13/2006 | 3.5 | Staffing Plan (3.5.4: Updated the tables and figures for Phasing out of Personnel)
6 | 0.6 | 11/13/2006 | 5.5 | Project Schedule (5.5: Updated MS Project schedule figures)
7 | 0.7 | 11/20/2006 | 3.3 | Risk Management
7 | 0.7 | 11/20/2006 | 1.4 | Reference Materials
7 | 0.7 | 11/20/2006 | 1.5 | Definitions and Acronyms
7 | 0.7 | 11/20/2006 | 3.4.4 | Progress
8 | 1.0 | 11/22/2006 | Appendix I | Complete Work Breakdown Structure (Updated WBS Specifications)
8 | 1.0 | 12/01/2006 | 1.1.3.3 | Work Detail Summary
8 | 1.0 | 12/01/2006 |  | Cost Certainty
8 | 1.0 | 12/01/2006 | Index | Index
8 | 1.0 | 12/01/2006 | Appendices G-K | Appendices G-K (Inserted Cost Certainty into Appendix G; reordered all other appendices)
8 | 1.0 | 12/01/2006 | 5.1 | Work Breakdown Structure (5.1: Inserted explanation on WBS computation)
8 | 1.0 | 12/01/2006 | 5.2 | Dependencies (5.2: Added Resource Allocation graph for an employee)

Table 1 – Revisions

Preface
Maintaining an accurate inventory of weapons is a critical goal for any armory. Failure to
do so can result in potentially dangerous situations in which military-caliber weapons are
not securely stored after use. Even today, with a myriad of technological options,
mistakes occur, due in large part to human error. When inventory time comes, weapons
inevitably turn up missing or simply misplaced, and time and money are then wasted
locating or replacing each firearm. These shortcomings are common to current armory
asset management systems.

The Automated Weapon Accountability and Tracking System (AWATS) is envisioned to
address these problems. The project aims to greatly reduce, or even eliminate, clerical
error when issuing firearms from armories. This will help those responsible for the
weapons to better manage their assets by eliminating paper and other inadequate systems
from the procurement process. Use of AWATS will also allow restrictions to be placed on
a user's access to certain types of firearms. Accurate procedure is vital to the weapons
procurement process, and AWATS ensures both that procedure is followed and that a
record is kept of all assets issued to or collected from the armory.

AWATS has the potential to drastically reduce armory budgets by providing real-time
inventories. If a weapon is missing, it will be known in a short period of time, thus
improving the chances of finding it (since the system will always know who last checked
out the weapon).

The AWATS system will be developed by extending the System XYZ software packages
to support:
• Asset Management
• Asset Accountability
• Role-Based Security
• Reporting

These extensions will be built using the following packages:


• General Purpose Database Package
• Compile/Link/Runtime Package
• Project Management Package
• Spreadsheet Package
• Requirement Management Package
• GPS Navigation Package
• Electronic Inventory and Tracking Package
• Communications Package
• Word Processing Package
• Debugging/Testing Package
• Graphical Presentation Package

The development schedule for the 11 AWATS software extensions is 24 months. The full
set of packages will be in final Alpha testing on or before August 30, 2008. Two months
of Alpha testing will be followed by two months of Beta testing, ultimately leading to a
two-month rollout. Six months of maintenance will be provided, with the option to
continue support in yearly increments afterward.

Abstract
The function of the Software Project Management Plan (SPMP) is to act as a controlling
document for managing the Automated Weapon Accountability and Tracking System
(AWATS) by defining the technical and managerial processes to complete all project
requirements. This document contains project organization, managerial processes,
technical processes, work packages, scheduling, and budget considerations for AWATS.

This is a living document that actively reflects the current planning state of AWATS.
As such, it should be maintained and referenced by all personnel assigned to this project.

Total Cost

Resource                   Cost             Remarks

Documentation Costs:       $626,250.00      See Appendix B
Hardware Costs:            $11,025,000.00   See Appendix C
Helpdesk Costs:            $2,162,000.00    See Appendix D
Software Package Costs:    $23,606,400.00   See Appendix E
Software Tool Costs:       $389,046.00      See Appendix F
System Kernel Software:    $3,000,000.00    Assume 20K SLOC

Total Project Cost:        $40,808,696.00
Table 2 – Total Costs

Project Charter

Project Sponsor Information

Sponsor Name: Henry "Hap" Arnold
Title: Vice President, New Software and Systems Engineering Development
Business Need: The Automated Weapon Accountability and Tracking System (AWATS)
will help ensure that all weapons used by the USMC are properly accounted for through
the use of passive RFID tags.
Business Benefits: The number of weapons that are lost, misplaced, or stolen will be
dramatically decreased. Accountability for who has weapons will be increased, and
human error will be decreased.

Major Deliverables

Product(s) or Service(s): Asset Management System; Asset Accountability System;
Role-Based Security; Reporting

Schedule Constraints and Assumptions

Planned Start Date: August 28, 2006
Planned End Date: August 30, 2008
Latest End Date:
Schedule Assumptions: AWATS will be developed and provide internal product releases
during the next 24 months, with a full Alpha test on or before August 30, 2008.
Schedule Constraints: Schedule allocation is as follows:
  24 months of AWATS development
  2 months Alpha test at Fairfax, VA
  2 months Beta test in Chelmsford, MA
  2 months rollout to customer sites
  6 months maintenance

Key Staffing Requirements

Project Manager: William Wallace
Title: Senior Project Manager
Date Available: 9/1/2006
Status:
Other Key Staff: James "Jimmy" Doolittle
Title: Tech Lead
Date Available: 9/1/2006
Status:

Other Constraints and Assumptions

Constraints: AWATS software will extend the System XYZ software. AWATS will
extend the 11 existing System XYZ packages. Three modules will be extended through
COTS packages, three will extend reusable portions of software, and the remaining five
modules will be custom written. One of the five custom modules will be outsourced to
Ivan Industries.
Project Scope
The AWATS project will provide customers the capability to automatically account for
and track weapon assignments, and will provide real-time asset management paired with
role-based security and reporting functionality. The AWATS project will use passive
RFID tags, fixed RFID interrogators, and 2D barcode scanners to provide the customer
with increased efficiency and reliability in enforcing weapons accountability and
inventory management.

The products and services to be provided by the AWATS project include Asset
Management, Asset Accountability, Role-Based Security, and Reporting capabilities.

Exclusions to the AWATS project include any hardware installation and delivery
necessary for the AWATS software to function; detailed documentation will be provided
for hardware installation. The AWATS project will provide for asset management and
accountability based on a weapon's transaction history.

AWATS Hardware Configuration (Notebook Configuration)


• Two Universal 2006-A Microprocessors, 3.2 GHz
• 17 inch display
• A three-button mouse pointing device
• 2 GB of main memory (Synchronous Dynamic RAM)
• 16 megabytes of Video RAM
• 3.5 inch floppy drive
• DVD/R/RW and CD-RW Combo Drive
• A 100 megabyte ZIP drive
• Printer port
• Asynchronous port
• Integrated 802.11g wireless LAN
• Three USB ports
• 1394/FireWire connector
• A LAN interface card
• A 56,000 bps capable fax/modem

AWATS Bundled Devices


• Speakers
• Laser Printer
• 8-cell lithium ion battery
• A Bar Code scanner (assumption: the scanner can read 2-D bar codes)
• Personal Assistant Device (131 megahertz, 32 MB RAM)
• 36-Bit Color Flatbed Scanner (600 dpi optical resolution)
• System ABC enhanced keyboard
• A CRT monitor (1,280 x 1,024 non-interlaced; high resolution; bit-mapped; 21 inch color display)
• Port replicator
• A stand for the CRT monitor
• Power connector
• Digital camcorder
• Internal speakers
• Wireless digital phone with voice mail messaging and internet service
• 4 ISBN ports
• RFID interrogators
• RFID tags

AWATS Application Packages


• General Purpose Database Package
• Spreadsheet Package
• Configuration & Requirements Management Package
• Communication Package
• Graphics Presentation Package
• Word Processing Package
• Project Manager's Package
• GPS Navigation Package
• Compile/Link/Runtime Packages for MS Visual Studio
• Language Independent Debugging & Testing Package
• Electronic Inventory & Tracking Package

Table of Contents
Revisions.............................................................................................................................ii
Preface.................................................................................................................................v
Abstract..............................................................................................................................vii
Total Cost.........................................................................................................................viii
Project Charter....................................................................................................................ix
Project Scope.......................................................................................................................x
Table of Contents..............................................................................................................xii
List of Tables.....................................................................................................................xv
List of Figures...................................................................................................................xvi
1.0 Introduction....................................................................................................................1
1.1 Project Overview.......................................................................................................1
1.1.1 Project Description.............................................................................................1
1.1.2 Product Summary...............................................................................................6
1.1.3 Project Summary..............................................................................................12
1.2 AWATS Project Deliverables..................................................................................15
1.2.1 Software Applications, Computer Software Configuration Item (CSCI).........15
1.2.2 Delivery Locations and Quantities...................................................................16
1.2.3 Documentation..................................................................................................16
1.3 Evolution of the Software Project Management Plan..............................................17
1.4 Reference Materials.................................................................................................17
1.4.1 RFID Standards................................................................................................17
1.4.2 Software Project Management Plan..................................................................18
1.5 Definitions and Acronyms.......................................................................................18
1.5.1 Definitions........................................................................................................18
1.5.2 Acronyms..........................................................................................................22
2.0 Project Organization....................................................................................................24
2.1 Process Model..........................................................................................................24
2.1.1 Process Milestones............................................................................................25
2.1.2 Baseline.............................................................................................................30
2.1.3 Reviews.............................................................................................................30
2.2 Organizational Structure..........................................................................................33
2.3 Organizational Boundaries and Interfaces...............................................................37
2.4 Project Responsibilities...........................................................................................39
2.4.1 Project Manager................................................................................................39
2.4.2 Assistant Project Manager................................................................................39
2.4.3 Chief Programmer............................................................................................39
2.4.4 Secretary...........................................................................................................39
2.4.5 Systems Engineer/Analyst................................................................................39
2.4.6 Requirements Analyst.......................................................................................39
2.4.7 Technical Team Leader....................................................................................39
2.4.8 Programmer......................................................................................................39
2.4.9 Tester................................................................................................................40

2.4.10 Help Desk Technician.....................................................................................40
2.4.11 RFID Integration Manager.............................................................................40
2.4.12 RFID Engineer................................................................................................40
2.4.13 Project Specialists...........................................................................................40
3.0 Managerial Process......................................................................................................53
3.1 Management Objectives and Priorities....................................................................53
3.1.1 Goals and Objectives........................................................................................53
3.1.2 Management Priorities......................................................................................54
3.2 Assumptions, Dependencies, and Constraints.........................................................55
3.2.1 Assumptions.....................................................................................................55
3.2.2 Dependencies....................................................................................................55
3.2.3 Constraints........................................................................................................56
3.3 Risk Management....................................................................................................56
3.4 Monitoring and Controlling Mechanisms................................................................58
3.4.1 Schedule and Budget........................................................................................58
3.4.2 Quality Assurance.............................................................................................58
3.4.3 Productivity.......................................................................................................59
3.4.4 Progress.............................................................................................................66
3.5 Staffing Plan............................................................................................................68
3.5.1 Obtaining Personnel..........................................................................................68
3.5.2 Training.............................................................................................................69
3.5.2 Retaining...........................................................................................................69
3.5.4 Phasing out of Personnel..................................................................................69
4.0 Technical Process........................................................................................................93
4.1 Methods, Tools, and Techniques.............................................................................93
4.2 Software Documentation.........................................................................................96
4.3 Project Support Functions........................................................................................99
4.3.1 Configuration Management..............................................................................99
4.3.2 Quality Assurance.............................................................................................99
4.3.3 Verification and Validation............................................................................100
4.3.4 Test and Evaluation........................................................................................100
4.3.5 Information Systems Security.........................................................................101
5.0 Work Packages, Resource Requirements & Estimations, Budget, and Schedule.....102
5.1 Work Packages......................................................................................................102
5.1.1 Work Packages Specifications Example*......................................................121
5.2 Dependencies.........................................................................................................122
5.3 Resource Requirements.........................................................................................125
5.4 Budget and Schedule Allocation...........................................................................128
5.5 Project Schedule....................................................................................................138
Appendix A: Zachman Enterprise Architecture Framework...........................................144
Appendix B: Documentation Costs.................................................................................145
Appendix C: Hardware Costs..........................................................................................146
Appendix D: Helpdesk Costs...........................................................................................147
Appendix E: Software Package Costs.............................................................................148
Appendix F: Software Tool Costs...................................................................................149
Appendix G: Cost Certainty............................................................................................151

Appendix H: Documentation Process..............................................................................152
Appendix I: Product Assurance Documentation TOC....................................................153
Appendix J: Complete Work Breakdown Structure (WBS)............................................155
Appendix K: COCOMO II Complete Output..................................................................166
Index................................................................................................................................213

List of Tables
Table 1 – Revisions............................................................................................................iv
Table 2 – Total Costs........................................................................................................viii
Table 3 – Delivery Locations & Quantities.......................................................................16
Table 4 – Documentation...................................................................................................16
Table 5 – Evolution of the Software Project Management Plan.......................................17
Table 6 – RFID Standards.................................................................................................17
Table 7 – Acronym Listing................................................................................................23
Table 8 – Activities, Benchmarks, and Milestones...........................................................29
Table 9 – Database Responsibility Matrix.........................................................................42
Table 10 – Spreadsheet Responsibility Matrix..................................................................43
Table 11 – Requirements Management Responsibility Matrix.........................................44
Table 12 – Communication Responsibility Matrix...........................................................45
Table 13 – Graphics Presentation Responsibility Matrix..................................................46
Table 14 – Word Processing Responsibility Matrix..........................................................47
Table 15 – Project Management Responsibility Matrix....................................................48
Table 16 – GPS Navigation Responsibility Matrix...........................................................49
Table 17 – Compile/Link/Runtime Responsibility Matrix................................................50
Table 18 – Language Independent Debugging & Testing Responsibility Matrix.............51
Table 19 – Electronic Inventory & Tracking Responsibility Matrix.................................52
Table 20 – Risk Mitigation Strategy..................................................................................57
Table 21 – Productivity Review Schedule.........................................................................61
Table 22 – Group’s Appropriate Metrics...........................................................................62
Table 23 – Class vs. Phase Measurements........................................................................64
Table 24 – Project Measures Matrix..................................................................................65
Table 25 – Metrics Set for AWATS Project......................................................................66
Table 26 – Staff, Expertise, Recruitment, and Utilization Sheet.......................................71
Table 27 – AWATS Staffing (September 2006 – January 2008)......................................75
Table 28 – AWATS Staffing (February 2008 – November 2008)....................................78
Table 29 - Software Documentation..................................................................................98
Table 30 - Software Documentation................................................................................120
Table 31 – Resource Requirements (Part I).....................................................................125
Table 32 – Resource Requirements (Part II)...................................................................126
Table 33 – Resource Requirements (Part III)..................................................................127
Table 34 – Budget & Schedule Allocation......................................................................133
Table 35 – Resource Allocation (Part I)..........................................................................134
Table 36 – Resource Allocation (Part II).........................................................................135
Table 37 – Resource Allocation (Part III).......................................................136
Table 38 – Zachman Enterprise Architecture Framework..............................................144
Table 39 – Documentation Costs.....................................................................................145
Table 40 – Hardware Costs..............................................................................................146
Table 41 – Helpdesk Costs..............................................................................................147
Table 42 – Software Package Costs.................................................................................148
Table 43 – Software Tool Costs......................................................................................150

List of Figures
Figure 1 – System Architecture Diagram............................................................................3
Figure 2 – Computer Architecture Diagram........................................................................4
Figure 3 – ADR/Armory Use Case Diagram.......................................................5
Figure 4 – Process Model Milestone Chart.......................................................................25
Figure 5 - Process Model Milestone Chart (continued)....................................................26
Figure 6 – Corporation Structure.......................................................................................33
Figure 7 – Vice-President Software Division Structure....................................................34
Figure 8 – Project Management Structure.........................................................................35
Figure 9 – Project Team Structure.....................................................................................36
Figure 10 – Program Manager Organizational Interfaces.................................................37
Figure 11 – Project Manager Organizational Interfaces....................................................38
Figure 12 – Progress Tracking...........................................................................................67
Figure 13 - Progress Indicator Example............................................................................68
Figure 14 - Database Staffing Over Time..........................................................................79
Figure 15 - Spreadsheet Staffing Over Time.....................................................................80
Figure 16 - Requirements Management Staffing Over Time............................................81
Figure 17 - Secure Communications Staffing Over Time.................................................82
Figure 18 - Graphics Presentation Staffing Over Time.....................................................83
Figure 19 - Word Processing Staffing Over Time.............................................................84
Figure 20 - Project Management Staffing Over Time.......................................................85
Figure 21 - GPS Navigation Staffing Over Time..............................................................86
Figure 22 - Compile/Link/Runtime Staffing Over Time...................................................87
Figure 23 - Debugging/Testing Staffing Over Time.........................................................88
Figure 24 - Electronic Inventory Staffing Over Time.......................................................89
Figure 25 - Management Staff Over Time........................................................................90
Figure 26 - Overall Staff Over Time.................................................................................91
Figure 27 - Overall Staff by Application Over Time........................................................92
Figure 28 - Work Breakdown Structure: AWATS all 11 packages................................103
Figure 29 - Work Breakdown Structure: General Purpose Database Package (COTS)..104
Figure 30 - Work Breakdown Structure: Compile/Link/Runtime Package (COTS).......105
Figure 31 - Work Breakdown Structure: Project Management Package (COTS)...........106
Figure 32 - Work Breakdown Structure: Spreadsheet Package (Reuse).........................107
Figure 33 - Work Breakdown Structure: Requirement Management Package (Reuse)....108
Figure 34 - Work Breakdown Structure: GPS Navigation Package (Reuse)..................109
Figure 35 - Work Breakdown Structure: Electronic Inventory & Tracking Package
(Custom)..........................................................................................................................110
Figure 36 - Work Breakdown Structure: Communications Package (Custom)..............111
Figure 37 - Work Breakdown Structure: Word Processing Package (Custom)..............112
Figure 38 - Work Breakdown Structure: Debugging/Testing Packages (Custom).........113
Figure 39 – Work Breakdown Structure: Graphical Presentation Packages (Outsourced)
.........................................................................................................................................114
Figure 40 – Dependencies (Part I)...................................................................................122
Figure 41 – Dependencies (Part II)..................................................................................123

Figure 42 – Example Resource Allocation (Matt Henry, October 2007 – January 2008)
.........................................................................................................................................137
Figure 43 - Schedule Estimate - Overview......................................................................138
Figure 44 - Schedule Estimate - Detail (Part I)...............................................................139
Figure 45 - Schedule Estimate - Detail (Part II)..............................................................140
Figure 46 - Schedule Estimate - Detail (Part III).............................................................141
Figure 47 - Schedule Estimate - Detail (Part IV)............................................................142
Figure 48 - Schedule Estimate – Resource List Detail....................................................143
Figure 49 - Documentation Process.................................................................................152

1.0 Introduction
1.1 Project Overview
1.1.1 Project Description

The AWATS project is an ambitious Radio Frequency Identification (RFID) asset
management solution built for the United States Marine Corps (USMC). The AWATS
system is envisioned to provide a solution capable of enforcing proper issuance/collection
procedures of weapons from armories, through the use of passive RFID tags, fixed RFID
interrogators, and 2D barcode scanners (used to scan CAC cards). The system is
configurable to recognize specific users and what types of weapons he or she may
procure. AWATS will provide up-to-the-moment inventory and report generation
capabilities for any armory currently using the system. The ultimate goal of AWATS is to
provide customers with a secure and user transparent mechanism for enforcing weapon
accountability and maintaining inventory.

The AWATS project will be composed of many tasks, drawing from technical and
managerial process definitions, project organization, work package delivery, scheduling
and budget monitoring. At the conclusion of the period of performance, AWATS will
enhance and automate the capabilities of the current System XYZ in order to inventory
weapon assets for the Marines. In order for AWATS to be a successful project,
milestones must be reached on time and to expectation (of both client and contractor).
Weapon mounting options for the passive RFID tags must be investigated and a viable
solution found (while maintaining the schedule). Additionally, RFID interrogator
mounting options must be determined for various armory layouts. Delivery of the 750
advance orders must proceed without delay or additional cost to the customer.
Additionally, ABC must be prepared to fabricate additional orders in a timely manner,
should the customer wish to order more units in the near future. With these goals
established, the corporation can expect a healthy profit margin and the monetary stability
to pursue enhancements to the system in order to attract additional clients.

The extension of System XYZ will involve additions to the following packages:
• General Purpose Database Package (COTS)
• Compile/Link/Runtime Package (COTS)
• Project Management Package (COTS)
• Spreadsheet Package (Reuse)
• Requirement Management Package (Reuse)
• GPS Navigation Package (Reuse)
• Electronic Inventory and Tracking Package (Custom Developed)
• Communications Package (Custom Developed)
• Word Processing Package (Custom Developed)
• Debugging/Testing Package (Custom Developed)
• Graphical Presentation Package (Outsourced to Ivan Industries)

Unique challenges must be overcome for this solution to be successful. In the end, the
communications channels between AWATS clients and the AWATS Data Repository
(ADR) will be secured through encryption, and hardware failure scenarios will be fully
explored (with appropriate and cost-effective resolutions documented). Points of failure
will be eliminated where possible. In the event of failure, redundancy will be used (where
cost is not prohibitive) to ensure that the end-user is not affected while the primary
system is made operational.

There are enormous implications for extending the AWATS system. The data contained
within the ADR will be available via a secure web service. This will allow the
interoperability of AWATS with external and existing systems. Additionally, with the use
of passive RFID tags mounted on weapons, field inventory systems may also be
developed in remote locations to provide further asset accountability and visibility.
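
For illustration only, the sketch below shows one way such an endpoint might look as a
.NET 2.0 ASP.NET web service; the class name, method, and namespace URI are
placeholders, since the actual ADR service contract will be defined during design.

    using System.Web.Services;

    // Illustrative sketch; not the actual ADR service contract.
    [WebService(Namespace = "http://awats.example/adr")]
    public class AdrQueryService : WebService
    {
        // Returns the CAC ID of the last recipient of the given rifle,
        // supporting the "who last checked it out" query described above.
        [WebMethod]
        public string LastCustodian(string rifleRfid)
        {
            // A real implementation would query the ADR transaction history.
            return null;
        }
    }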

AWATS will be developed using the waterfall software life cycle. Best practices from the
industry will be used in each phase of development. Where possible, code will be re-used
to cut down on development costs.

Figure 1 – System Architecture Diagram

[Figure: client computer architecture with two Universal 2006-A microprocessors
(3.2 GHz), 2 x 1 GB SDRAM on the memory bus, AGP and PCI bus hubs, an AGP GPU
(64 MB VRAM) driving the monitor (15.4" display), a 60 GB hard disk, 3.5" floppy,
100 MB Zip, and CD/DVD/R/RW combo drives, three USB ports, printer and
asynchronous ports, 802.11g wireless LAN, a LAN NIC, a 1394/FireWire connector,
and a 56K modem.]
Figure 2 – Computer Architecture Diagram

Figure 3 –ADR/Armory Use Case Diagram

1.1.2 Product Summary

1.1.2.1 Statement

AWATS is an automated client and communication system that will be used for securing
Marine armory assets, primarily rifles such as the M-14, M-16A2, and M-4. These
systems will continuously track and account for every weapon stored in armories at any
location in the United States and at any American military base or facility worldwide.

The objective is to move the weapons control process away from its dependence on large
volumes of paper documents and human memory, which can produce erroneous results.
This transition (once complete) will also benefit quick-deployment ambitions, and it
therefore meets the goals of moving to automated systems mandated by the General
Services Administration's (GSA) federal guidelines #89893, authorized by presidential
Executive Order #38398, for streamlining the government's bureaucratic agencies. More
importantly, AWATS can provide an efficient and reliable platform for weapon asset
management, which will reduce the percentage of unaccounted-for or lost weapons.

The AWATS system can reduce the number of on-duty Marines required to perform
weapons inventory at all times, especially prior to and after each troop deployment. This
reduction in manpower is significant at this time because the restructuring of the U.S.
armed services into a quickly deployable military force is ongoing. Additionally, the
Marines have assumed more responsibilities in the war against terrorism and in Iraq;
these troops can be reassigned to perform other, more critical and specialized tasks. At
the same time, the deployment of AWATS systems can cut inventory time down to just
10% of the current manual effort.[1] Even after factoring in the cost of purchasing the
new systems and their maintenance and upgrades over the next 10 years, the cost to
perform these tasks with AWATS will be reduced to just 50% of the original spending.[2]
In addition, error rates in issuing and/or collecting the correct rifles across all USMC
armories will decrease from 1.55% to merely 0.08%.[3]

1.1.2.2 Function Overview

The AWATS system consists of the following integral parts: (1) passive RFID tags;
(2) 2-D barcode scanners; (3) mounting hardware; (4) AWATS clients; (5) cabling; and
(6) RFID interrogators. (Assumption: a fast broadband Internet connection, T1 or higher,
exists at the armories and at all AWATS-related facilities.)

[1] United States Marine Corps – Manpower & Task Efficiency Annual Report 2005;
Section 8.2.1, Charts 5 and 6, and Figures 8 and 9 show the current manpower level
required for armory tasks.
[2] United States Marine Corps – Manpower & Task Efficiency Annual Report 2005;
Section 8.3.4, Chart 7, and Figure 10 show the cost required for operating a single
armory.
[3] United States Marine Corps – Manpower & Task Efficiency Annual Report 2005;
Section 8.4.1, Chart 9, and Figure 13 show the error ratio in armory inventory occurring
in the USMC's armories.

At the check-out window, the RFID interrogators will scan any firearm passing through
and associate the rifle's unique ID with the recipient's CAC ID, which must be presented
to and positively identified by an armory clerk. Prior to issuing the rifle, the clerk must
scan the recipient's CAC card into the system for validation and verification after a
successful visual identification.
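
A minimal sketch of this check-out flow follows. All names here (clearances, custody,
IssueWeapon) are illustrative assumptions rather than the actual AWATS interfaces,
and plain dictionaries stand in for the AWATS Data Repository.

    using System;
    using System.Collections.Generic;

    class IssuanceSketch
    {
        // CAC ID -> weapon types the holder is cleared to draw (role-based security).
        static readonly Dictionary<string, List<string>> clearances =
            new Dictionary<string, List<string>>();

        // Rifle RFID -> CAC ID of the current holder; absent while the rifle is racked.
        static readonly Dictionary<string, string> custody =
            new Dictionary<string, string>();

        // Called after the clerk visually identifies the recipient and scans the CAC card.
        public static bool IssueWeapon(string cacId, string rifleRfid, string weaponType)
        {
            List<string> allowed;
            if (!clearances.TryGetValue(cacId, out allowed) || !allowed.Contains(weaponType))
                return false;               // recipient is not cleared for this weapon type
            if (custody.ContainsKey(rifleRfid))
                return false;               // rifle is already checked out
            custody[rifleRfid] = cacId;     // rifle's unique ID tied to the recipient's CAC ID
            Console.WriteLine("Issued {0} to {1} at {2}", rifleRfid, cacId, DateTime.UtcNow);
            return true;
        }
    }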

In summary, the AWATS system provides:

• Asset Management
• Asset Accountability & Tracking
• Role-Based Security
• Reporting (WinForms)

These software system features will be built and implemented using the following
packages:

• General Purpose Database Package
• Spreadsheet Package
• Configuration & Requirements Management Package
• Communication Package
• Graphics Presentation Package
• Word Processing Package
• Project Manager's Package
• GPS Navigation Package
• Compile/Link/Runtime Packages for MS Visual Studio
• Language Independent Debugging & Testing Package
• Electronic Inventory & Tracking Package

General Purpose Database Package: the COTS product Microsoft SQL Server 2005 will
be used to provide storage for the data tables, forms, views, reports, and queries that
match each rifle to its issuance and collection transactions.
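
As a sketch of what recording one such transaction might look like (the connection
string and the WeaponTransaction table below are assumptions; the real schema is a
design-phase artifact):

    using System.Data.SqlClient;

    class TransactionLog
    {
        // Placeholder connection string for illustration only.
        const string ConnStr = "Server=ADR01;Database=AWATS;Integrated Security=SSPI;";

        // Inserts one issuance or collection row; action is "ISSUE" or "COLLECT".
        public static void Record(string rifleId, string cacId, string action)
        {
            using (SqlConnection conn = new SqlConnection(ConnStr))
            using (SqlCommand cmd = new SqlCommand(
                "INSERT INTO WeaponTransaction (RifleId, CacId, Action, OccurredAt) " +
                "VALUES (@rifle, @cac, @action, GETUTCDATE())", conn))
            {
                cmd.Parameters.AddWithValue("@rifle", rifleId);
                cmd.Parameters.AddWithValue("@cac", cacId);
                cmd.Parameters.AddWithValue("@action", action);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }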

Spreadsheet Package: a reusable Excel DLL module will be employed for importing
and exporting statistical data.

Configuration & Requirements Management Package: the CRMP components built for
previous projects inside the company will be reused to manage the project configuration
and requirements.

Communication Package: an in-house component will wrap the Secure Sockets Layer
(SSL) to secure the communication links between all armories and the AWATS servers,
as well as the links to the USMC Logistics Command in Albany, GA and the USMC
Systems Command in Quantico, VA.
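
The core idea of such a wrapper, sketched with the .NET 2.0 SslStream class (the host
name and certificate policy here are placeholders, not the in-house component's actual
design):

    using System.Net.Security;
    using System.Net.Sockets;

    class SecureChannel
    {
        // Opens an SSL-protected channel from an armory client to an AWATS server.
        public static SslStream Open(string host, int port)
        {
            TcpClient client = new TcpClient(host, port);
            SslStream ssl = new SslStream(client.GetStream());
            ssl.AuthenticateAsClient(host);   // validates the server certificate
            return ssl;                       // armory traffic is encrypted on this stream
        }
    }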

Graphics Presentation Package: a component developed by Ivan Industries, Inc. (under
direction from ABC) will be integrated to enhance the AWATS data reporting and
statistical capabilities.

Word Processing Package: in-house software will be used to work with OpenOffice for
filing reports and general word processing purposes.

Project Manager’s Package: the widely used COTS product Microsoft Project will help
the Project Manager and all team leaders to manage the project schedule and task
progress.

GPS Navigation Package: an existing in-house component will be reused to map the
locations of armories.

Compile/Link/Runtime Packages for MS Visual Studio: the COTS suite Microsoft
Visual Studio 2005 will be used as the runtime and development environment for
building the AWATS software. C# will be the programming language.

Language Independent Debugging & Testing Package: in-house testing tools will be
created for unit tests in C#.
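
The in-house tools are not yet specified; the following is only a minimal flavor of the
kind of assertion helper such a package might contain:

    using System;

    static class MiniAssert
    {
        // Prints one PASS/FAIL line per check; a real harness would also tally results.
        public static void AreEqual(object expected, object actual, string testName)
        {
            bool pass = object.Equals(expected, actual);
            Console.WriteLine("{0}: {1}", pass ? "PASS" : "FAIL", testName);
            if (!pass)
                Console.WriteLine("  expected <{0}> but was <{1}>", expected, actual);
        }
    }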

Electronic Inventory & Tracking Package: an in-house component will be built to track
each rifle's storage status.

1.1.2.3 End-User Profile

The primary end-users of the AWATS systems will be armory clerks, whose duties are
usually performed by a team of one Gunnery Sergeant (E-7), one Staff Sergeant (E-6),
and eight more junior enlisted Marines (E1-E5). This team is responsible for the
accountability, maintenance, and safeguarding of the assigned unit's weapon armories.
These personnel have user roles on the system and, at a minimum, must be proficient
with the general use of computers.

The Gunnery Sergeant and the Staff Sergeant will have administrative rights to file
reports and submit queries. These users will require a more in-depth background with
computers.

The Supply and Audit personnel of the assigned unit will also have access to the
AWATS system in privileged-user roles: reporting only for Supply and querying only
for Audit.
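
A sketch of the role model implied above (the role names and rights are assumptions
drawn from this profile, not a finalized design):

    using System;

    [Flags]
    enum AwatsRights
    {
        None         = 0,
        IssueCollect = 1,   // scan cards, issue and collect weapons
        Report       = 2,   // file reports
        Query        = 4    // submit queries
    }

    class Roles
    {
        public static AwatsRights RightsFor(string role)
        {
            switch (role)
            {
                case "Clerk":  return AwatsRights.IssueCollect;
                case "Admin":  return AwatsRights.IssueCollect  // Gunnery/Staff Sergeant
                                    | AwatsRights.Report
                                    | AwatsRights.Query;
                case "Supply": return AwatsRights.Report;       // reporting only
                case "Audit":  return AwatsRights.Query;        // querying only
                default:       return AwatsRights.None;
            }
        }
    }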

1.1.2.4 Development Environment

The development environment will consist of a server, a client workstation, a 2D barcode
scanner, passive RFID interrogators, passive RFID tags, and mounting hardware.

The server will have the following configuration:


• OS: Windows 2003 Server, Enterprise Edition
• .NET Framework 2.0
• SQL Server 2005, Standard Edition
• IIS 6.0, Internet Explorer 6.0
• 4 GB of RAM
• 300 GB Hard Drive
• DVD/R/RW Drive
• 3 USB Ports
• 1394/FireWire connector
• Gigabit Ethernet Card

The client workstation will have the following configuration:


• OS: Windows XP Professional, SP2
• Internet Explorer 6.0
• .NET Framework 2.0
• 2 GB RAM
• 80 GB Hard Drive
• DVD Drive
• 4 USB Ports
• 1394/FireWire connector
• 100 Mbps Ethernet Card
• Keyboard and Mouse
• 19" LCD Screen

The development environment will be set up on a LAN with a connection between the
server and the client workstation.

The client system will be deployed onto Desktops with the following configuration:
• 2 Universal 2006-A microprocessors, 3.2 GHz
• 17 inch display
• 3 button mouse
• 2 GB of RAM
• 16 MB of Video RAM
• 60 GB Hard Drive
• 3.5 inch floppy
• DVD/R/RW and CD-RW Combo Drive
• Integrated 802.11g Wireless LAN
• 3 USB ports
• 1 FireWire connector
• 1 Ethernet card
• 1 56K modem

The deployment system will be bundled with:

• Laser Printer
• A Bar Code scanner (assumption: the scanner can read 2-D bar codes)
• Personal Assistant Device (131 megahertz, 32 MB RAM)
• System ABC enhanced keyboard
• A CRT monitor (1,280 x 1,024 non-interlaced; high resolution; bit-mapped; 21 inch color display)
• A stand for the CRT monitor
• Power connector
• Internal speakers
• RFID interrogators
• RFID tags

1.1.2.5 Priorities and Constraints

AWATS' most critical function is to enforce proper issuing and collection procedures for
armory firearms. The Database, Compile/Link/Runtime, and Project Management
Packages will be COTS products and serve as the basis for AWATS development. A
Microsoft platform will be leveraged, with code based in C# (although Ivan Industries
may use its own tools when developing the Graphical Presentation Package, which is a
separate module). The Spreadsheet, Requirements Management, and GPS Navigation
packages will be constructed from existing proprietary, COTS, or legacy software
components. The Electronic Inventory and Tracking, Communications, Word Processing,
and Debugging/Testing packages will be developed by ABC and its RFID Applications
Division in Vienna, VA. The Graphical Presentation Package will be outsourced for
development to Ivan Industries, Inc.

1.1.2.6 Risk Factors

AWATS stores very sensitive information about the status of firearms, so a failure could
result in life-and-death situations. The greatest risks will be in securing communications
between AWATS clients and the AWATS Data Repository (ADR), and in ensuring that
all hardware failure scenarios (including RFID, server, and client hardware) are handled
with minimal downtime. Since redundancy is built into the ADR, the probability of
having all three servers (running in different locations) inoperable at once is near zero.
Since the failure rate of modern passive RFID tags is around 20% [4], contingency plans
for bad tags must be implemented and integrated into the system. Although armory
visibility grouping is a capability of AWATS (enabling several armories to act as a single
entity with regard to assets), it is imperative that armories not grouped together are
unable to see each other's inventory.

[4] http://www.ti.com/rfid/docs/news/in_the_news/2003/10-08-03.shtml
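
One possible bad-tag contingency, sketched below as an assumption rather than the
documented AWATS design, is to retry the passive tag read and then fall back to the
rifle's 2-D barcode, so a dead tag cannot block an issue or collection:

    class TagFallback
    {
        delegate string ReadAttempt();   // returns null when a read fails

        // Tries the RFID tag a few times, then falls back to the barcode scanner.
        static string IdentifyRifle(ReadAttempt readRfidTag, ReadAttempt scanBarcode)
        {
            for (int attempt = 0; attempt < 3; attempt++)
            {
                string id = readRfidTag();
                if (id != null)
                    return id;
            }
            return scanBarcode();        // manual 2-D barcode fallback
        }
    }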

1.1.3 Project Summary

1.1.3.1 Project Overview

The goal of AWATS is to provide customers with a secure and user-transparent
mechanism for enforcing weapon accountability and maintaining inventory. The AWATS
project will provide the customer with increased efficiency and reliability in enforcing
weapons accountability and inventory management. The benefits of the AWATS project
include proper enforcement of the issuing/collecting of weapons from an armory,
up-to-the-moment inventory and reporting capabilities for any armory currently using the
system, and role-based security.

The AWATS system will deliver 11 packages (General Purpose Database Package,
Spreadsheet Package, Configuration & Requirements Management Package,
Communication Package, Graphics Presentation Package, Word Processing Package,
Project Manager's Package, GPS Navigation Package, Compile/Link/Runtime Packages
for C#, Language Independent Debugging & Testing Package, and Electronic Inventory
& Tracking Package) that will utilize the System XYZ platform.

This project will last approximately two years, with additional months for Alpha and
Beta testing. An additional rollout phase will occur when Beta testing is complete. Cost
figures for the project are described below in the Work Detail Summary (Section 1.1.3.3).

1.1.3.2 Profile of Typical Users

United States Enlisted Marine

An Enlisted Marine will interact with the system when he/she needs a weapon(s) issued.

United States Marine Armory Clerk

The Armory Clerk will interact with the system every time an Enlisted Marine makes a
weapon request. The Armory Clerk is responsible for the accountability, maintenance,
and safeguard of the armory. The Armory Clerk role entails scanning the Enlisted
Member’s CAC card, verifying the identity of the Enlisted Marine (by visual comparison
of the Enlisted Marine and his/her CAC card image), issuing/collecting a weapon from
the armory, and verifying the assignment was made in the AWATS system (using the
AWATS client).

1.1.3.3 Work Detail Summary

The total system cost of AWATS is $40,808,696.00. This figure is based on details from
Section 5 (Work Breakdown Structure) and Appendices B through F.

AWATS will be constructed by several development teams over a two-year period. Unit
testing will occur throughout each module's construction. In the final four months, the
entire system will go through two months of Alpha testing, followed by two months of
Beta testing, before being released to the customer. More detail on the AWATS schedule
can be found in Section 5.5.

AWATS will be broken down into ten distinct areas (with respective costs):

1. Project Management                    $347,000.00
2. Technical Management                  $512,000.00
3. Quality Assurance (QA)                $1,352,000.00
4. Configuration Management (CM)         $678,000.00
5. Software Systems                      $23,606,400.00
6. Hardware                              $11,025,000.00
7. Kernel                                $3,000,000.00
8. Documentation                         $626,250.00
9. Software Support Environment          $389,046.00
10. Post Deployment Software Support     $2,162,000.00

Due to the nature of the SPMP, the Software Systems will be further broken down into
eleven packages:

I. COTS $6,373,728.00
1. Database $4,013,088.00
2. Compile/Link/Runtime $1,416,384.00
3. Project Management $944,256.00
II. Reuse $3,777,024.00
1. Spreadsheet $1,652,448.00
2. Requirements Management $944,256.00
3. GPS $1,180,320.00
III. Custom Code $12,983,520.00
1. Electronic Inventory $7,554,048.00
2. Secure Communications $2,596,704.00
3. Word Processing $1,180,320.00
4. Debugging/Testing $1,652,448.00
IV. Outsourcing $472,128.00
1. Graphics $472,128.00

Each of the above packages is completely broken down in Section 5.1.
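
As a cross-check, the four groups above sum exactly to the Software Systems figure:
$6,373,728 + $3,777,024 + $12,983,520 + $472,128 = $23,606,400.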

1.1.3.4 Software Configuration Environment

ABC’s software configuration environment has been developed in-house over the last
five years. Extensions will be made as needed for this project.

The tool has the following features:


• Source Control with differential backups
• Daily Builds integrated with Unit Testing
• Bug and Change Request Tracking
• Automated Deployment to different environments
• An administrative module through an ASP.NET site
• Total integration with Visual Studio 2005

1.2 AWATS Project Deliverables

1.2.1 Software Applications, Computer Software Configuration Item (CSCI)

Database (CSCI-08-N01) – Central repository of all data and transactions throughout the
system. Commercial-off-the-shelf (COTS) – SQL Server 2005. Application and
Documentation delivery date: August 30, 2008.

Spreadsheet (CSCI-08-N02) – Allows for reporting of data and customization of that
data. Reuse. Application and Documentation delivery date: August 30, 2008.

Configuration Management (CSCI-08-N03) – Used for bug tracking, source control, and
daily builds. Reuse. Application and Documentation delivery date: August 30, 2008.

Secure Communication Package (CSCI-08-N04) – Ensures secure communication
between clients and server. Custom. Application and Documentation delivery date:
August 30, 2008.

Graphics Presentation Package (CSCI-08-N05) – Allows for reporting of system status
and data. Outsourced. Application and Documentation delivery date: August 30, 2008.

Word Processing Package (CSCI-08-N06) – Allows for creation of documents by users.
Custom. Application and Documentation delivery date: August 30, 2008.

Project Manager's Package (CSCI-08-N07) – Supports project management through
charts and document management. COTS – MS Project. Application and Documentation
delivery date: August 30, 2008.

GPS Navigation Package (CSCI-08-N08) – Ability to create maps of the locations of
armories and servers throughout the world. Reuse. Application and Documentation
delivery date: August 30, 2008.

Compile/Link/Runtime Package (CSCI-08-N09) – Tool for developers to compile and
create their code. COTS – MS Visual Studio 2005 Professional. Application and
Documentation delivery date: August 30, 2008.

Debug and Testing Package (CSCI-08-N10) – Tool for the automated testing of code and
the debugging of defects. Custom. Application and Documentation delivery date: August
30, 2008.

Electronic Inventory and Tracking Package (CSCI-08-N11) – The issuing and collection
of weapons is tracked through this package. Custom. Application and Documentation
delivery date: August 30, 2008.

1.2.2 Delivery Locations and Quantities

Location           Quantity
Norfolk, VA        100 Systems
Jacksonville, FL   100 Systems
Memphis, TN        150 Systems
Dallas, TX         50 Systems
San Diego, CA      100 Systems
Mishawaka, IN      150 Systems
Boston, MA         50 Systems
Mobile, AL         50 Systems

Success Metric (all locations): Install at least 50% of the systems ordered and train 50
users.
Table 3 – Delivery Locations & Quantities

1.2.3 Documentation

Document                                             COTS   Reuse   Custom
Requirements Specification for extension of AWATS    X      X       X
As-built Design Document                                            X
Documented Source Code                                              X
Test Plan and Test Cases                             X      X       X
Test Results                                         X      X       X
Requirements Traceability Matrix                     X      X       X
User Reference Manuals                                              X
Training Materials                                   X      X       X
Installation Guides                                  X      X       X
Table 4 – Documentation

1.3 Evolution of the Software Project Management Plan

Date                 Deliverable
September 18, 2006   Stage 1 – Project Overview, Project Deliverables, Evolution of Software Project Management Plan
September 25, 2006   Stage 2 – Management Objectives and Priorities; Assumptions, Dependencies, and Constraints; Resource Estimate; Schedule Estimate
October 2, 2006      Stage 3 – Process Model, Organizational Structure, Organizational Interfaces, Project Responsibilities
October 23, 2006     Stage 4 – Technical Methods, Tools, and Techniques; Software Documentation; Project Support Functions; Staffing Plan
October 30, 2006     Stage 5 – Monitoring and Controlling Mechanisms
November 13, 2006    Stage 6 – Work Packages, Dependencies, Resource Requirements, Budget and Resource Allocation, Schedule
November 20, 2006    Stage 7 – Risk Management, Reference Materials, Definitions and Acronyms
December 4, 2006     Stage 8 – Additional Components, Index, Appendices
Table 5 – Evolution of the Software Project Management Plan

1.4 Reference Materials

1.4.1 RFID Standards

[1] In researching RFID technologies, we used the following website for standards and other useful information: http://www.aimglobal.org/technologies/rfid/. The table below lists several RFID standards we found helpful while developing the AWATS SPMP.

Standard Abbreviation   Description
JTC 1/SC 31             Automatic Identification and Data Capture Techniques
JTC 1/SC 17             Identification Cards and related devices
ISO TC 104/SC4          Identification and communications
Table 6 – RFID Standards

[2] We also found the White Paper entitled Understanding RFID Compliance Standards helpful while developing the AWATS SPMP. This White Paper can be found at http://www.dsionline.com/collateral/pdf/software/wp_RfidStandards.pdf.

1.4.2 Software Project Management Plan

[1] Fairley, Richard E., "A Guide for Preparing Software Project Management Plans." George Mason University.

[2] IEEE Std. 1058.1-1987, "IEEE Standard for Software Project Management Plans."

[3] CMMI℠ for Systems Engineering/Software Engineering/Integrated Product and Process Development/Supplier Sourcing, Version 1.1, Staged Representation (CMMI-SE/SW/IPPD/SS, V1.1, Staged).

1.5 Definitions and Acronyms

1.5.1 Definitions
Alpha Test – Simulated or actual operational testing by potential users/customers or an independent test team at the developers' site. Alpha testing is often employed for off-the-shelf software as a form of internal acceptance testing, before the software goes to beta testing. [5]

Baseline – Generally, a baseline may be a single work product or a set of work products that can be used as a logical basis for comparison. A baseline (whose work products meet certain criteria) may also be established as the basis for subsequent selected activities; such activities may carry formal approval. [6]

Beta Test – Represents the first version of a computer program that implements all features in the initial software requirements specification. It is likely to be unstable but useful for internal demonstrations and previews to select customers, but not yet ready for release. [7]

C# – An object-oriented programming language developed by Microsoft as part of their .NET initiative, and later approved as a standard by ECMA and ISO. C# has a procedural, object-oriented syntax based on C++ that includes aspects of several other programming languages (most notably Delphi, Visual Basic, and Java) with a particular emphasis on simplification (fewer symbolic requirements than C++, fewer decorative requirements than Java). [8]

Cost – Cost to develop a project depends on several variables including (chiefly): labor rates, material rates, risk management, plant (buildings, machines, etc.), equipment, and profit. When hiring an independent consultant for a project, cost will typically be determined by the consultant's or firm's per diem rate multiplied by an estimated quantity for completion. [9]

[5] http://en.wikipedia.org/wiki/Alpha_test
[6] http://en.wikipedia.org/wiki/Baseline_%28configuration_management%29
[7] http://en.wikipedia.org/wiki/Beta_test
[8] http://en.wikipedia.org/wiki/C_sharp

Constructive Cost Model – COCOMO is a model designed by Barry Boehm to give an estimate of the number of man-months it will take to develop a software product. [10]
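
For illustration, the Basic COCOMO effort equation has the following form (the coefficients shown are Boehm's published values for an "organic" project, not this project's calibrated figures):

    E = a * (KLOC)^b                 (effort in staff-months)
    Organic mode: a = 2.4, b = 1.05
    Example: 100 KLOC  =>  E = 2.4 * 100^1.05 ≈ 2.4 * 125.9 ≈ 302 staff-months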

Configuration Management – A set of processes and technologies that support the evolutionary life cycle of digital information. This digital information is often referred to as content or, to be precise, digital content. Digital content may take the form of text, such as documents, multimedia files, such as audio or video files, or any other file type which follows a content lifecycle which requires management. [11]

Detailed Design – The process of defining the lower-level components, modules, and interfaces of a software system. [12]

Kernel – The central component of most computer operating systems (OSs). Its responsibilities include managing the system's resources and the communication between hardware and software components. [13]

Integration Testing – The phase of software testing in which individual software modules are combined and tested as a group. It follows unit testing and precedes system testing. Integration testing takes as its input modules that have been unit tested, groups them in larger aggregates, applies tests defined in an Integration test plan to those aggregates, and delivers as its output the integrated system ready for system testing. [14]

Java – An object-oriented programming language developed by Sun Microsystems in the early 1990s. Java applications are designed to be compiled to bytecode, which is interpreted at runtime, unlike conventional programming languages, which either compile source code to native (machine) code or interpret source code at runtime. [15]

Life Cycle – A structure imposed on the development of a software product. Synonyms include software life cycle and software process. There are several models for such processes, each describing approaches to a variety of tasks or activities that take place during the process. [16]

Milestone – Within the framework of project management, a milestone is a terminal element that marks the completion of a work package or phase, typically marked by a high-level event such as completion, endorsement, or signing of a deliverable, document, or a high-level review meeting. Typically a milestone is associated with some sort of decision that outlines the future of a project. [17]

[9] http://en.wikipedia.org/wiki/Project_management
[10] http://en.wikipedia.org/wiki/COCOMO_II
[11] http://en.wikipedia.org/wiki/Content_management
[12] http://styx.esrin.esa.it/premfire/Docs/PSS0505.pdf
[13] http://en.wikipedia.org/wiki/Kernel_%28computer_science%29
[14] http://en.wikipedia.org/wiki/Integration_testing
[15] http://en.wikipedia.org/wiki/Java_programming_language
[16] http://en.wikipedia.org/wiki/Software_development_lifecycle

Outsource – Involves transferring or sharing management control and/or decision-making of a business function to an outside supplier, which involves a degree of two-way information exchange, coordination, and trust between the outsourcer and its client. [18]

Peer Review – In software development, peer review refers to a type of software review in which a work product (normally some form of document) is examined by its author and/or one or more colleagues of its author, in order to evaluate its technical content and quality. [19]

Preliminary Design – Preliminary Design is the first phase of the design process. A Project Manager from Project Management is assigned to the project and will coordinate a series of meetings with users and the Design Team for information gathering. Users communicate specific needs/requirements, and the Design Team will do field investigation regarding the layout of the existing areas in question, including building systems and their impact on the project. The Design Team generates schemes based on the project information provided by Project Management. Schemes will be reviewed by all stakeholders and refined accordingly. This phase of the project defines the design parameters and lays out the overall scheme. [20]

Quality – The quality of a product or service refers to the perception of the degree to which the product or service meets the customer's expectations. Quality has no specific meaning unless related to a specific function and/or object. Quality is a perceptual, conditional, and somewhat subjective attribute. [21]

Quality Assurance – Covers all activities from design, development, production, installation, servicing, and documentation. It introduced the sayings "fit for purpose" and "do it right the first time". It includes the regulation of the quality of raw materials, assemblies, products, and components; services related to production; and management, production, and inspection processes. [22]

Requirements – A singular documented need of what a particular product or service should be or do. [23]

[17] http://en.wikipedia.org/wiki/Milestone_%28Project_management%29
[18] http://en.wikipedia.org/wiki/Outsource
[19] http://en.wikipedia.org/wiki/Software_peer_review
[20] http://www.med.yale.edu/fdo/pmc/process/pd.html
[21] http://en.wikipedia.org/wiki/Quality#In_Engineering_and_Manufacturing
[22] http://en.wikipedia.org/wiki/Quality_assurance
[23] http://en.wikipedia.org/wiki/Requirements

Reuse – The idea that a partial or complete computer program written at one time can be, should be, or is being used in another program written at a later time. The re-use of programming code is a common technique which attempts to save time and energy by reducing redundant work. [24]

Risk – A concept that denotes a potential negative impact to an asset or some characteristic of value that may arise from some present process or future event. In everyday usage, "risk" is often used synonymously with the probability of a loss or threat. In professional risk assessments, risk combines the probability of an event occurring with the impact that event would have and with its different circumstances. [25]

Scope – Requirements specified for the end result. The overall definition of what the project is supposed to accomplish, and a specific description of what the end result should be or accomplish. A major component of scope is the quality of the final product. The amount of time put into individual tasks determines the overall quality of the project. Some tasks may require a given amount of time to complete adequately, but given more time could be completed exceptionally. Over the course of a large project, quality can have a significant impact on time and cost (or vice versa). [26]

Spiral Development – A software development process combining elements of both design and prototyping-in-stages, in an effort to combine advantages of top-down and bottom-up concepts. The spiral model was defined by Barry Boehm in his 1985 article A Spiral Model of Software Development and Enhancement. This model was not the first to discuss iterative development, but it was the first to explain why the iteration matters. As originally envisioned, the iterations were typically 6 months to 2 years long. Each phase starts with a design goal and ends with the client (who may be internal) reviewing the progress thus far. Analysis and engineering efforts are applied at each phase of the project, with an eye toward the end goal of the project. [27]

System Testing – Takes (as input) all of the "integrated" software components that have successfully passed Integration testing, and also the software system itself integrated with any applicable hardware system(s). The purpose of Integration testing is to detect any inconsistencies between the software units that are integrated together (called assemblages) or between any of the assemblages and the hardware. System testing is a more limited type of testing; it seeks to detect defects both within the "inter-assemblages" and also within the system as a whole. [28]

Time – The duration available to complete a project. Broken down for analytical purposes into the time required to complete the components of the project, which is then further broken down into the time required to complete each task contributing to the completion of each component. [29]

[24] http://en.wikipedia.org/wiki/Code_reuse
[25] http://en.wikipedia.org/wiki/Risk
[26] http://en.wikipedia.org/wiki/Scope_%28project_management%29
[27] http://en.wikipedia.org/wiki/Spiral_model
[28] http://en.wikipedia.org/wiki/System_testing
[29] http://en.wikipedia.org/wiki/Scope_%28project_management%29

21
Unit Testing – A procedure used to validate that a particular module of source code is working properly from each modification to the next. The procedure is to write test cases for all functions and methods so that whenever a change causes a regression, it can be quickly identified and fixed. Ideally, each test case is separate from the others; constructs such as mock objects can assist in separating unit tests. This type of testing is mostly done by the developers and not by end-users. [30]
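
As a minimal C#/NUnit sketch of the mock-object idea (all types here are hypothetical stand-ins for illustration, not actual AWATS components), the RFID tag reader is hidden behind an interface so a mock can take the hardware's place in the test:

    using NUnit.Framework;

    // The reader interface lets the test substitute a mock for real hardware.
    public interface ITagReader
    {
        string ReadSerial();
    }

    public class MockTagReader : ITagReader
    {
        public string ReadSerial() { return "TEST-SERIAL-0001"; }
    }

    public class CheckoutService
    {
        private ITagReader reader;
        public CheckoutService(ITagReader reader) { this.reader = reader; }
        public string ScanWeapon() { return reader.ReadSerial(); }
    }

    [TestFixture]
    public class CheckoutServiceTests
    {
        [Test]
        public void ScanWeapon_ReturnsSerialFromReader()
        {
            // The mock isolates the unit under test from the RFID hardware.
            CheckoutService service = new CheckoutService(new MockTagReader());
            Assert.AreEqual("TEST-SERIAL-0001", service.ScanWeapon());
        }
    }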

Waterfall Model – A sequential software development model (a process for the creation of software) in which development is seen as flowing steadily downwards (like a waterfall) through the phases of requirements analysis, design, implementation, testing (validation), integration, and maintenance. [31]

Work Breakdown Structure – An exhaustive, hierarchical (from general to specific) tree structure of deliverables and tasks that need to be performed to complete a project. The Work Breakdown Structure is a very common and critical project management tool. It is considered such a key part of project management that many United States government statements of work require a WBS. [32]

Work Packages – A subset of a project that can be assigned to a specific party for execution. Because of the similarity, work packages are often misidentified as projects. [33]

[30] http://en.wikipedia.org/wiki/Unit_testing
[31] http://en.wikipedia.org/wiki/Waterfall_model
[32] http://en.wikipedia.org/wiki/Work_breakdown_structure
[33] http://en.wikipedia.org/wiki/Work_package

1.5.2 Acronyms

SPMP – Software Project Management Plan. Self-explanatory.

CSCI – Computer Software Configuration Item. Self-explanatory.

AWATS – Automated Weapons Accountability and Tracking System. Used to denote the entire system to be developed for the USMC by ABC.

RFID – Radio Frequency Identification. An automatic identification method, relying on storing and remotely retrieving data using tags which may be read by interrogators using radio waves.

ADR – AWATS Data Repository. Central distributed data repository for AWATS. Consists of several servers running in different locations to handle failover scenarios.

RDCL – Redundant Database Communication Layer. Responsible for handling data transfers between servers in the ADR. Includes failover business logic.

IIML – Intelligent Inventory Management Layer. Brokers transactions between AWATS clients and the RDCL. Handles all business-layer logic for issue/collection transactions.

SCL – Secure Communication Layer. Built using SSL to encrypt all data traffic between AWATS clients and the ADR via the RDCL.

DR – Disaster Recovery. The planning and creation of a plan to deal with a disaster of catastrophic scale (e.g. earthquake, tornado, hurricane, etc.).

Table 7 – Acronym Listing
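
To make the SCL entry above concrete, here is a minimal sketch, assuming a plain TCP transport, of how a client might open an SSL-protected channel using the .NET 2.0 SslStream class; the host name and port are placeholders, not actual ADR endpoints:

    using System.Net.Security;
    using System.Net.Sockets;

    // Sketch only: wraps a TCP connection to the ADR in an SslStream.
    public class SecureChannel
    {
        public static SslStream Open(string adrHost, int port)
        {
            TcpClient client = new TcpClient(adrHost, port);
            SslStream ssl = new SslStream(client.GetStream());
            // Authenticates the server; its certificate must match the host name.
            ssl.AuthenticateAsClient(adrHost);
            return ssl;
        }
    }

All reads and writes through the returned stream are then encrypted, which is the property the SCL relies on.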

2.0 Project Organization
2.1 Process Model

Because a pilot project was completed and accepted by the USMC before the contract was won, it is assumed that the AWATS software system requirements have been baselined by agreement between the USMC and ABC, Inc. This in turn lowers the level of risk for this project. Royce's waterfall software development process model will therefore be employed to guide the project team through the phases of system requirements analysis, high-level design, low-level design, development and implementation, testing and validation, integration, and system maintenance.

2.1.1 Process Milestones
The following milestone charts describe all of the process model milestones for the entire project.

[Chart not reproduced in this text rendering. It traces the project from business needs and system requirements (completed in the pilot project phase) through the Software Requirements Review, Preliminary Design Review, Critical Design Review, System Integration & Test, System Alpha Test, and System Beta Test, with parallel tracks for identifying, researching, evaluating, selecting, integrating, and testing COTS packages; evaluating outsource vendors and modifying reusable packages; designing, prototyping, coding, and unit-testing custom components; and integrating the reusable and outsourced components, fixing bugs, and performance testing.]

Figure 4 – Process Model Milestone Chart

[Chart not reproduced in this text rendering. It continues from the System Beta Test (user training, user feedback, adding or changing functions where feasible and resources allow, updating COTS and fixing COTS issues, fixing bugs reported from the Beta release, and re-testing new or updated functions) through creation of the final CM build for production release, AWATS installation at client sites, delivery of final reports, contract deliverables, and administrative documentation to the client, and ongoing technical support and system maintenance, ending in project phase-out.]

Figure 5 – Process Model Milestone Chart (continued)

The following table describes the activities, benchmarks, and milestones achieved, along with the documents submitted and approved.

Project Initiation
  Benchmarks: Create Project Charter & draft SPMP; form organization; establish baseline schedule; select development methods, process, and tools; staffing; establish project member training.
  Milestones/Success Indicators: Project space, equipment, and networks ready to use; Development Environment Plan created; SPMP in place and signed off; software development process selected; Training Plan in place; management baseline established.

Requirements
  Benchmarks: Analyze & define requirements; final staffing estimates & budgets in place; review requirements for development & testability; create requirements traceability matrices; revisit SLOC estimate; define product interface; finalize SPMP; develop high-level design.
  Milestones/Success Indicators: Hardware requirements defined; requirements documented; functional & allocated baseline established; features and time allocation estimates signed off; SLOC revised. Assumption: the requirements-gathering phase is complete because of the client's acceptance of the requirements from the AWATS pilot project.

Evaluation of COTS & Reusable Components
  Benchmarks: Evaluate COTS; evaluate reusable components; test COTS & reusable components; draft a Test Plan.
  Milestones/Success Indicators: COTS & reusable components evaluated; COTS & reusable components tested.

Preliminary Design
  Benchmarks: Develop preliminary software design; re-estimate SLOC; create Unit Test Plan.
  Milestones/Success Indicators: Preliminary design & review completed; functional specifications completed. Assumption: the requirements have been finalized.

Detailed Design
  Benchmarks: Hold low-level design peer review; create detailed program designs; establish Software Development Folders; re-estimate SLOC; start End Users Manual; start I&T Plan; start maintenance documentation; start Fielding Plan.
  Milestones/Success Indicators: Detailed Design Review closed; interface baseline closed; design baseline established.

Coding & Unit Test
  Benchmarks: Create code according to the HLD & LLD documents; unit test; document test cases & results; write System Test Plan.
  Milestones/Success Indicators: Code for each component checked into the CM repository; source code listings; System Test Plan complete; system integration & testing procedures completed; unit test completed; code baseline established.

Software Integration & Test
  Benchmarks: User validation of components; integrate all 11 packages; integrate all AWATS components; integration test; start End Users Manual.
  Milestones/Success Indicators: Test results documented; traceability matrix documented; Integration Test release built successfully; documentation for each component; End Users Manual drafted.

System Test
  Benchmarks: System test (functional test, performance test, user acceptance test); Alpha Test released; once Alpha Test is completed, Beta Test released; fix bugs.
  Milestones/Success Indicators: CM builds for Alpha & Beta tests; test results & traceability matrix documented; final software acceptance review by the users (clients); product baseline established; End User Manual & maintenance manuals finalized; CM creates production build.

Install, Maintenance, & Support
  Benchmarks: Provide customer support for Alpha and Beta tests; release software patches as necessary; plan & release "dot"-level software upgrades in response to customer & marketing requests; provide system maintenance training.
  Milestones/Success Indicators: Program library closed & all documentation mothballed.

Phase Out
  Benchmarks: Re-allocate resources; hand off responsibility to the maintenance organization; hold lessons-learned sessions.
  Milestones/Success Indicators: Staff transitions completed; contract completion report delivered to the clients for sign-off; final report delivered to the internal business unit for project close-out.

Table 8 – Activities, Benchmarks, and Milestones

2.1.2 Baseline

Our baseline will come from the pilot project that was completed and accepted by the
USMC prior to winning this contract. All requirements will come from this baseline and
the waterfall process will start from this baseline.

2.1.3 Reviews

2.1.3.1 Software Requirements Review

At this review date, the customer will be presented with a Software Requirements Specification (SRS) to review for two weeks. It will include precise requirements for every feature and package to be included in the system. A Requirements Traceability Matrix will also be included to ensure that the requirements are attained throughout each phase of the waterfall process. The SRS should be signed off by the customer representative after the two-week review, at which point it becomes a binding agreement that holds higher priority than the RFP.

2.1.3.2 Preliminary Design Review

The customer will be invited to attend an internal review of the preliminary design at this
point in the process. The customer will be able to ask questions, but all design decisions
should be left to the development team if at all possible. The Lead Systems Analyst,
Lead Requirements Analyst, and Lead Programmer will lead this review with the
Programming and Analyst Team Leader making all the final decisions.

2.1.3.3 Critical Design Review

This will be a closed-door, all-hands meeting of the entire project team, in which the Reusable Components Team, COTS Enhancement Team, Custom Development Team, and Ivan Industries Team present their designs, source code, and unit test results. Each team will then face an intense questioning period by the Lead Programmers from the other teams. Decisions on changes to the system will be made by consensus of the Lead Programmers and will be documented and implemented after this review.

2.1.3.4 System Integration & Test

Once development of all AWATS software components is complete, all 11 packages will be integrated and checked into the CM repository under configuration control. This includes any new or updated code that interacts with the COTS, reusable, and outsourced components. All code will then be merged.

At this time, all Software Development Folders will be turned in to the software manager for review. The software manager should verify the code against the Requirements Traceability Matrices to ensure that the right components are being coded and unit-tested. Upon reviewing and approving the folders, the software manager will request that the project's Configuration Management Officer create a baselined build for Integration Test.

Integration Test verifies that AWATS's different components and packages work together correctly. The Integration Test is usually performed by a team drawn from both the development team and the system test team. This phase not only verifies AWATS functionality, it also performs intensive white-box testing on the back-end components of the system. For example, RFID transactions will be checked to confirm that they were transmitted to the AWATS servers successfully, that the sessions are secure and closed upon completion, and that the correct personal data for each Marine has been verified and entered into the database table correctly.
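
The sketch below illustrates one such white-box check, written as an NUnit test; the connection string, table, column, and serial number are hypothetical placeholders rather than the actual AWATS schema:

    using System.Data.SqlClient;
    using NUnit.Framework;

    [TestFixture]
    public class RfidTransactionIntegrationTests
    {
        [Test]
        public void IssueTransaction_IsPersistedToServer()
        {
            // Connects to the integration-test database (placeholder names).
            using (SqlConnection conn = new SqlConnection(
                "Server=awats-test;Database=AWATS;Integrated Security=SSPI"))
            {
                conn.Open();
                SqlCommand cmd = new SqlCommand(
                    "SELECT COUNT(*) FROM IssueTransactions WHERE Serial = @s", conn);
                cmd.Parameters.AddWithValue("@s", "M16A4-0001");
                // Exactly one row should exist for the transaction just issued.
                Assert.AreEqual(1, (int)cmd.ExecuteScalar());
            }
        }
    }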

Any bugs found during this phase will be fixed and checked into the CM repository. All fixes will be included and delivered to the System Test team for Alpha Testing.

2.1.3.5 System Alpha Test

The System Test Team and the Software Development Team will support this phase. An AWATS test environment will be set up internally to verify all AWATS functionality and performance against the Requirements Traceability Matrices. The System Test Team will be responsible for thoroughly finding and reporting all problems. The software team will support fixing any bugs found during Alpha Test. All fixes and changes will be checked into the CM repository.

During Alpha Test, the AWATS system's user documentation is also drafted. Resources will be pulled from the documentation team and the software team, with input from the client.

2.1.3.6 System Beta Test

An AWATS test environment will be set up to simulate the production configuration and settings. The software development team will provide technical and help desk support to both external and internal users during the Beta Test phase. Any feedback on system problems and expectations will be either addressed and fixed or recorded for future enhancements.

User Acceptance Testing will also be performed during this phase. Once the client's end users are satisfied with the system functionality, AWATS will be approved for a production build and deployed for on-site installations.

User Manuals and other documentation will be reviewed and finalized with the approval of management and the client. Upon approval, all documentation will be included for release with the production build during on-site installations.

2.1.3.7 AWATS Software System Installation

This will be the final review before the system is put into production. The source code will have been frozen for at least two weeks, compiled, and burned onto distribution CDs to be installed at client sites. Personnel will be tested to ensure they are properly trained to install the system at client sites. The customer will be intimately involved at this point in the project to help predict potential political problems with the user base and to prepare mitigation plans should they occur. Documentation will be delivered to the distribution point in finalized form. The help desk will be put through a 48-hour test to make sure it meets established customer service requirements. A knowledge base will also be set up and tested to ensure that the help desk has easy access to frequently encountered problems. Points of contact will be contacted at each installation site to set up the installation and go over the process and any remaining questions.

2.1.3.8 Project Phase-Out

After the six-month maintenance period, a review meeting will be held with project team management and the customer to ensure that every requirement has been adequately fulfilled. The Requirements Traceability Matrix will be a guiding document during this part of the review. A separate contract will be proposed at this point to continue support of the system and to cover any enhancements the customer would like. A final sign-off from the customer will be required, and a Project History Document will be presented to the customer containing recommendations for any future development team that takes over the system.

2.2 Organizational Structure

This section describes the organizational structure of ABC. ABC is led by the Chief
Executive Officer (CEO) and Chairman, followed by the Chief Financial Officer (CFO),
Chief Operating Officer (COO), and Chief Technology Officer (CTO). This structure is
presented below.

[Chart not reproduced in this text rendering. It shows the Board of Trustees and the CEO and Chairman above the business units and the Chief Financial, Operating, and Technology Officers; VPs for Finance and Administration, Legal Affairs, Sales and Marketing, the Hardware Division, and the Software Division; supporting directors (Information Technology, Comptroller, Professional Services, Commercial Sales, Litigation, Aeronautics, Patent Prosecution, Government Sales, Space Systems, Facilities, Marketing, Human Resources, Research and Development, and Security); and Program Managers for Communications and Networking, Systems Development, Microprocessor Development, Software Development, Systems Integration, and RFID.]

Figure 6 – Corporation Structure

The AWATS project will be led by the Vice-President of the Software Division. The
organization is listed below.

[Chart not reproduced in this text rendering. Under the VP Software Division are Program Managers for New Systems Development, Systems Software Development, and Systems Integration, with Project Managers for AWATS Development, RFID, Operating Systems, E-Commerce, the 2007-A Kernel, Distributed Systems Development, Hardware Integration, and Utilities.]

Figure 7 – Vice-President Software Division Structure

The AWATS project organization will be led by the RFID Systems Development
Program Manager.

[Chart not reproduced in this text rendering. The Project Manager AWATS Development and the Assistant Project Manager oversee Technical & Administrative Support and Documentation Support (documentation, administrative, help desk, installation, training, and manual teams, plus human resources, network support, and procurement specialists), Software Quality Control (configuration management, quality assurance, and test and evaluation teams), and the Reusable Components, COTS Enhancement, Custom Development, and Ivan Industries Project Managers, each with requirements management, design, and programming & analyst teams (the COTS branch adds a COTS integration team).]

Figure 8 – Project Management Structure

[Chart not reproduced in this text rendering. The Programming & Analyst Team Leader, supported by Quality Assurance and Configuration Management, directs the Lead Programmer (with Senior and Junior Programmers), the Lead Requirements Analyst (with Requirements Specialists), the Lead Systems Analyst (with Senior and Junior Analysts), and an optional RFID Engineer.]

Figure 9 – Project Team Structure

2.3 Organizational Boundaries and Interfaces

This section relates the managerial boundaries that exist between the AWATS project and
other segments of ABC and the customer. The following diagram shows related
interfaces with the AWATS Program Manager.

[Diagram not reproduced in this text rendering. The AWATS Program Manager interfaces with the VP Hardware Division, the CIO, Legal, Finance, Sales/Marketing, the Microprocessor Program Manager, the VP Software Division, the RFID Integration Manager, customers, the 8696 Processor Project Manager, the Program Manager System Software Division, the 2007-A Kernel Project Manager, and the Reusable Components, Custom Development, COTS Enhancements, Ivan Industries, and AWATS Project Managers.]

Figure 10 – Program Manager Organizational Interfaces

The following diagram shows related interfaces with the AWATS Project Manager.

[Diagram not reproduced in this text rendering. The AWATS Project Manager interfaces with the same corporate offices and project managers shown in Figure 10, plus Administrative Support & Overhead, the Software QA/QC Lead (CM, QA, and T&E staff), and the Helpdesk & Documentation Lead (training, manual, documentation, and help desk staff).]

Figure 11 – Project Manager Organizational Interfaces

2.4 Project Responsibilities
2.4.1 Project Manager

The Project Manager is responsible for overseeing all project activities. The project manager's role includes maintaining control over the scope of the project, assuring quality work from the team, and reporting status to the customer.

2.4.2 Assistant Project Manager

The Assistant Project Manager's responsibility is to assist the project manager in any of the responsibilities listed above. The assistant project manager is also responsible for the morale of the project team.

2.4.3 Chief Programmer

The Chief Programmer is responsible for managing the design, development, and testing
of the system software. The chief programmer also authors all low-level reusable project
code libraries.

2.4.4 Secretary

The Secretary’s responsibility involves authoring all meeting minutes and handling the
administration of the project.

2.4.5 Systems Engineer/Analyst

The Systems Engineer/Analyst is responsible for putting together the framework of the
project.

2.4.6 Requirements Analyst

The Requirements Analyst is responsible for extracting and documenting the customer
requirements.

2.4.7 Technical Team Leader

The Technical Team Leader's responsibility is to manage the development team. This involves staffing and tasking of all development personnel. The technical team leader assumes responsibility for all development work.

2.4.8 Programmer

The Programmer's main responsibility is the development of the software.

2.4.9 Tester

The Tester is responsible for testing the system. The tester will develop test scenarios to
ensure that the customer’s requirements are met by the project packages, document all
results of the test scenarios, and provide the results to the Technical Team Lead.

2.4.10 Help Desk Technician

The Help Desk Technician will provide technical support during system test and
integration of the system.

2.4.11 RFID Integration Manager

The RFID Integration Manager is responsible for the integration of RFID technology into
the system. This includes systems engineering, as well as interface specification between
the hardware and software.

2.4.12 RFID Engineer

The RFID Engineer is responsible for the actual integration of the RFID hardware into
the software packages. The RFID Engineer is also responsible for testing the RFID
equipment.

2.4.13 Project Specialists

The project specialists are individuals who specialize in a particular area of the software development process. These roles are:

 Training Specialist:
The Training Specialist is responsible for organizing and conducting all user training.

 Documentation Specialist:
The Documentation Specialist is responsible for authoring all user documentation and
maintenance documentation for the software. This individual may work in
conjunction with the Training Specialist to produce training documentation.

 Human Resource (HR) Specialist:
The Human Resource Specialist is responsible for procuring the necessary project resources. This individual may interview and screen applicants for the project.

 Procurement Specialist:
The Procurement Specialist is responsible for purchasing all necessary hardware and
software for the project. The Procurement Specialist is also responsible for reporting
purchase information to the customer.

 Configuration Management (CM) Specialist:
The Configuration Management Specialist is responsible for all version control of
documents and software releases. The configuration management specialist is
required to document all changes to the software for each build and provide a process
for maintaining version control throughout the testing and integration.

 Quality Assurance (QA) Specialist:
The Quality Assurance Specialist is responsible for checking each document for errors.

Database Responsibility Matrix
Function (staff-months)      Requirements   Design   Coding & Unit Testing   Integration Testing   System Testing   Total
Project Manager 2.4 2.4 2.4 2.1 2 11.3
Assistant Project Manager 2.4 2.45 2.4 2.1 2 11.4
Chief Programmer 2.2 2.6 3.5 2.3 1.4 12
Secretary 1.5 1.5 1.5 1.5 1.5 7.5
Systems Engineer/Analyst 2.1 2.1 1.8 1.8 1.8 9.6
Requirements Analyst 3.4 4.5 3.2 3.2 2.1 16.4
Technical Team Leader 3.5 4.5 3.2 3.2 3.2 17.6
Programmer 3.2 3.5 4.4 3.4 3.4 17.9
Tester 1.2 1.2 3.2 3.2 2.4 11.2
Help Desk Technician 0 0 0 1.2 1.9 3.1
RFID Integration Manager 1.2 2.1 2.1 2.1 2.1 9.6
RFID Engineers 1.2 2.1 2.1 2.1 2.1 9.6
Training Specialist 0 0 0.4 1.2 2 3.6
Documentation Specialist 1.17 1.17 1.17 1.17 1.3 5.98
HR Specialist 1 1 1 1 1 5
Procurement Specialist 1.2 1.2 1.2 1.2 1.2 6
CM Specialist 1.2 1.2 1.2 1.2 1.2 6
QA Specialist 1.2 1.2 1.2 1.2 1.2 6
Total Staff Months 30.07 34.72 35.97 35.17 33.8 169.7
Table 9 – Database Responsibility Matrix

Spreadsheet Responsibility Matrix
Function (staff-months)      Requirements   Design   Coding & Unit Testing   Integration Testing   System Testing   Total
Project Manager 0.78 0.78 0.78 0.78 0.78 3.9
Assistant Project Manager 0.78 0.78 0.78 0.78 0.78 3.9
Chief Programmer 0.89 1.5 1.9 1.5 1.5 7.29
Secretary 0.54 0.54 0.54 0.54 0.54 2.7
Systems Engineer/Analyst 1.9 1.2 1.2 1.2 1.2 6.7
Requirements Analyst 1.9 1.1 0.89 0.89 0.56 5.34
Technical Team Leader 2.2 2.2 2.2 2.2 2.2 11
Programmer 2.2 2.2 2.2 2.2 2.2 11
Tester 0.1 0.12 0.89 0.89 0.89 2.89
Help Desk Technician 0 0 0 0.54 0.54 1.08
RFID Integration Manager 2.2 2.2 2.2 2.5 2.2 11.3
RFID Engineers 2.2 2.2 2.2 2.5 2.2 11.3
Training Specialist 0 0 0 0.54 0.54 1.08
Documentation Specialist 0.34 0.89 0.89 0.6 0.6 3.32
HR Specialist 0.18 0.18 0.18 0.18 0.18 0.9
Procurement Specialist 0.12 0.12 0.12 0.12 0.12 0.6
CM Specialist 0.6 0.89 0.89 0.89 0.6 3.87
QA Specialist 0.6 0.6 0.6 0.6 0.6 3
Total Staff Months 17.53 17.5 18.46 19.45 18.23 91.2
Table 10 – Spreadsheet Responsibility Matrix

Requirements Management Responsibility Matrix
Function (staff-months)      Requirements   Design   Coding & Unit Testing   Integration Testing   System Testing   Total
Project Manager 0.78 0.78 0.78 0.78 0.78 3.9
Assistant Project Manager 0.78 0.78 0.78 0.78 0.78 3.9
Chief Programmer 1.8 2.1 2.5 2.5 2.5 11.4
Secretary 0.78 0.78 0.78 0.78 0.78 3.9
Systems Engineer/Analyst 2.1 2.1 2.1 2.1 2.1 10.5
Requirements Analyst 2.1 2.1 1.1 1.1 1.1 7.5
Technical Team Leader 1.8 2.1 2.5 2.5 2.5 11.4
Programmer 1.8 2.1 2.5 2.5 2.5 11.4
Tester 0.45 0.45 1.2 1.2 1.2 4.5
Help Desk Technician 0 0 0 0.54 0.54 1.08
RFID Integration Manager 1.8 2.1 2.5 2.5 2.5 11.4
RFID Engineers 1.8 2.1 2.5 2.5 2.5 11.4
Training Specialist 0 0 0.34 0.54 0.54 1.42
Documentation Specialist 0.6 0.6 0.6 0.6 0.6 3
HR Specialist 0.18 0.18 0.18 0.18 0.18 0.9
Procurement Specialist 0.23 0.45 0.45 0.23 0.23 1.59
CM Specialist 0.89 0.89 0.89 0.89 0.89 4.45
QA Specialist 0.89 0.89 0.89 0.89 0.89 4.45
Total Staff Months 18.78 20.5 22.59 23.11 23.11 108.1
Table 11 – Requirements Management Responsibility Matrix

Communications Responsibility Matrix
Function (staff-months)      Requirements   Design   Coding & Unit Testing   Integration Testing   System Testing   Total
Project Manager 0.78 0.78 0.78 0.78 0.78 3.9
Assistant Project Manager 0.78 0.78 0.78 0.78 0.78 3.9
Chief Programmer 0.89 2 2.8 2.3 2.3 10.3
Secretary 0.78 0.78 0.78 0.78 0.78 3.9
Systems Engineer/Analyst 0.78 0.68 0.67 0.56 0.56 3.25
Requirements Analyst 0.89 0.68 0.67 0.56 0.56 3.36
Technical Team Leader 0.89 1.8 2.8 2.3 2.3 10.1
Programmer 0.67 1.8 2.6 2.3 2.3 9.67
Tester 0.12 0.12 0.89 0.89 0.89 2.91
Help Desk Technician 0 0 0 0.54 0.54 1.08
RFID Integration Manager 0.89 1.8 2.3 2.3 2.3 9.59
RFID Engineers 0.89 1.8 2.3 2.3 2.3 9.59
Training Specialist 0 0 0 0.54 0.54 1.08
Documentation Specialist 0.34 0.45 0.45 0.45 0.45 2.14
HR Specialist 0.18 0.18 0.18 0.18 0.18 0.9
Procurement Specialist 0.12 0.12 0.12 0.12 0.12 0.6
CM Specialist 0.43 0.43 0.43 0.43 0.43 2.15
QA Specialist 0.43 0.43 0.43 0.43 0.43 2.15
Total Staff Months 9.86 14.63 18.98 18.54 18.54 80.6
Table 12 – Communications Responsibility Matrix

Graphics Presentation Responsibility Matrix
Function (staff-months)      Requirements   Design   Coding & Unit Testing   Integration Testing   System Testing   Total
Project Manager 2.4 2.3 2.3 2.2 2.2 11.4
Assistant Project Manager 2.4 2.3 2.3 2.2 2.2 11.4
Chief Programmer 2.1 2.5 2.8 2.8 2.8 13
Secretary 1.2 1.2 1.2 1.2 1.2 6
Systems Engineer/Analyst 2.1 2.5 2.1 2.1 2.1 10.9
Requirements Analyst 2.1 2.1 1.4 1.4 1.4 8.4
Technical Team Leader 2.3 2.5 2.6 2.6 2.6 12.6
Programmer 2.1 2.4 2.6 2.6 2.6 12.3
Tester 0.12 0.34 2.2 2.2 2.3 7.16
Help Desk Technician 0 0 1 1.5 1.5 4
RFID Integration Manager 2.1 2.1 2.6 2.6 2.6 12
RFID Engineers 2.1 2.1 2.6 2.6 2.6 12
Training Specialist 0 0 0 0.65 0.65 1.3
Documentation Specialist 1.9 1.9 1.3 1.2 1.2 7.5
HR Specialist 1 1 1 1 1 5
Procurement Specialist 1.1 1.1 0.89 0.89 0.89 4.87
CM Specialist 1.2 1.2 1.2 1.2 1.2 6
QA Specialist 1.2 1.2 1.2 1.2 1.2 6
Total Staff Months 27.42 28.74 31.29 32.14 32.24 151.8
Table 13 – Graphics Presentation Responsibility Matrix

Word Processing Responsibility Matrix
Function (staff-months)      Requirements   Design   Coding & Unit Testing   Integration Testing   System Testing   Total
Project Manager 0.89 0.89 0.78 0.78 0.78 4.12
Assistant Project Manager 0.89 0.89 0.78 0.78 0.78 4.12
Chief Programmer 0.67 0.67 0.89 0.89 0.89 4.01
Secretary 0.78 0.78 0.78 0.78 0.67 3.79
Systems Engineer/Analyst 1.9 1.9 1.1 1.1 1.1 7.1
Requirements Analyst 1.9 1.9 1.1 1.1 1.1 7.1
Technical Team Leader 0.67 0.78 0.89 0.89 0.89 4.12
Programmer 0.67 0.78 0.89 0.89 0.89 4.12
Tester 0 0 1.1 1.1 1.1 3.3
Help Desk Technician 0 0 0 0.45 0.45 0.9
RFID Integration Manager 0.89 0.89 0.89 0.89 0.89 4.45
RFID Engineers 0.89 0.89 0.89 0.89 0.89 4.45
Training Specialist 0 0 0 0.45 0.35 0.8
Documentation Specialist 0.45 0.45 0.67 0.67 0.67 2.91
HR Specialist 0.45 0.45 0.45 0.45 0.45 2.25
Procurement Specialist 0.89 0.89 0.4 0.4 0.4 2.98
CM Specialist 1.1 1.1 1.1 1.1 1.1 5.5
QA Specialist 0.89 0.89 0.89 0.89 0.89 4.45
Total Staff Months 13.93 14.15 13.6 14.5 14.29 70.5
Table 14 – Word Processing Responsibility Matrix

Project Management Responsibility Matrix
Function (staff-months)      Requirements   Design   Coding & Unit Testing   Integration Testing   System Testing   Total
Project Manager 1 1 0.89 0.89 0.89 4.67
Assistant Project Manager 1 1 0.89 0.89 0.89 4.67
Chief Programmer 1.1 1.4 1.5 1.5 1.5 7
Secretary 0.54 0.78 0.78 0.54 0.54 3.18
Systems Engineer/Analyst 1.2 1.1 1.1 0.89 0.78 5.07
Requirements Analyst 1.2 1.1 0.9 0.89 0.78 4.87
Technical Team Leader 1.2 1.1 1.4 1.4 1.4 6.5
Programmer 1.2 1.4 1.5 1.5 1.5 7.1
Tester 0.12 0.12 1.1 1.1 1.1 3.54
Help Desk Technician 0 0 0 0.89 0.89 1.78
RFID Integration Manager 1.2 1.1 1.5 1.5 1.5 6.8
RFID Engineers 1.2 1.1 1.5 1.5 1.5 6.8
Training Specialist 0 0 0 0.54 0.54 1.08
Documentation Specialist 0.2 0.89 0.89 0.89 0.89 3.76
HR Specialist 0.54 0.54 0.54 0.54 0.54 2.7
Procurement Specialist 0.12 0.23 0.23 0.12 0.12 0.82
CM Specialist 1 1 1 1 1 5
QA Specialist 1 1 1 1 1 5
Total Staff Months 13.82 14.86 16.72 17.58 17.36 80.3
Table 15 – Project Management Responsibility Matrix

GPS Navigation Responsibility Matrix
Function (staff-months)      Requirements   Design   Coding & Unit Testing   Integration Testing   System Testing   Total
Project Manager 1.2 1.2 1.2 1.2 1.2 6
Assistant Project Manager 1.2 1.2 1.2 1.2 1.2 6
Chief Programmer 1.2 1.7 2.3 1.5 1.5 8.2
Secretary 1.2 1.2 1.2 1.2 1.2 6
Systems Engineer/Analyst 1.9 1.7 1.2 1.2 1.2 7.2
Requirements Analyst 1.9 1.7 1.2 1.2 1.2 7.2
Technical Team Leader 1.3 1.7 2.4 1.7 1.7 8.8
Programmer 1.3 1.7 2.3 1.7 1.7 8.7
Tester 0.3 0.3 1.2 1.2 1.2 4.2
Help Desk Technician 0 0 0.2 0.54 0.54 1.28
RFID Integration Manager 1.3 1.5 1.7 1.7 1.7 7.9
RFID Engineers 1.3 1.5 1.7 1.7 1.7 7.9
Training Specialist 0 0 0 0.89 0.89 1.78
Documentation Specialist 0.89 1.3 1.3 1.3 1.3 6.09
HR Specialist 0.18 0.18 0.23 0.18 0.18 0.95
Procurement Specialist 1.1 1.1 0.89 0.89 0.89 4.87
CM Specialist 1.1 1.1 1.1 1.1 1.1 5.5
QA Specialist 1.1 1.1 1.1 1.1 1.1 5.5
Total Staff Months 18.47 20.18 22.42 21.5 21.5 104.1
Table 16 – GPS Navigation Responsibility Matrix

Compile/Link/Runtime Responsibility Matrix
Function (staff-months)      Requirements   Design   Coding & Unit Testing   Integration Testing   System Testing   Total
Project Manager 0.8 0.8 0.8 0.8 0.8 4
Assistant Project Manager 0.8 0.8 0.8 0.8 0.8 4
Chief Programmer 1.1 1.1 1.2 0.89 0.89 5.18
Secretary 0.54 0.54 0.54 0.54 0.54 2.7
Systems Engineer/Analyst 2.1 2.1 1.9 1.5 1.5 9.1
Requirements Analyst 2.1 2.1 1.7 1.7 1.7 9.3
Technical Team Leader 0.54 2.1 1.9 1.9 1.5 7.94
Programmer 1.2 2.1 1.8 1.4 1.4 7.9
Tester 0.12 0.12 1.1 1.1 1.1 3.54
Help Desk Technician 0 0 0 0.54 0.54 1.08
RFID Integration Manager 0.34 0.34 0.54 0.54 0.54 2.3
RFID Engineers 0.34 0.34 0.54 0.54 0.54 2.3
Training Specialist 0 0 1.1 0.54 0.54 2.18
Documentation Specialist 0.2 0.2 0.45 0.3 0.3 1.45
HR Specialist 0.12 0.12 0.12 0.12 0.12 0.6
Procurement Specialist 0.12 1.1 1.1 0.34 0.34 3
CM Specialist 1 1 1 1 1 5
QA Specialist 1 1 1 0.89 0.89 4.78
Total Staff Months 12.42 15.86 17.59 15.44 15.04 76.4
Table 17 – Compile/Link/Runtime Responsibility Matrix

Language Independent Debugging & Testing Responsibility Matrix
Function (staff-months)      Requirements   Design   Coding & Unit Testing   Integration Testing   System Testing   Total
Project Manager 1.2 1.2 1.2 1.2 1.2 6
Assistant Project Manager 1.2 1.2 1.2 1.2 1.2 6
Chief Programmer 1.1 1.1 1.1 1.1 1.1 5.5
Secretary 0.8 0.8 0.8 0.8 0.8 4
Systems Engineer/Analyst 2.1 2.1 2.1 2.1 2.1 10.5
Requirements Analyst 2.3 2.3 1.8 1.8 1.8 10
Technical Team Leader 1.2 1.2 2 2 2 8.4
Programmer 1 1.8 2 2 2 8.8
Tester 0 0 1.2 1.2 1.2 3.6
Help Desk Technician 0 0 0 1 1 2
RFID Integration Manager 1.2 1.2 1.7 1.7 1.7 7.5
RFID Engineers 1.2 1.2 1.7 1.7 1.7 7.5
Training Specialist 0 0 0 1 1 2
Documentation Specialist 0.33 0.33 0.33 0.33 0.33 1.65
HR Specialist 0.34 0.34 0.34 0.34 0.34 1.7
Procurement Specialist 0.4 0.4 0.4 0.4 0.4 2
CM Specialist 1.1 0.9 0.9 0.9 0.9 4.7
QA Specialist 0.45 0.45 0.45 0.45 0.45 2.25
Total Staff Months 15.92 16.52 19.22 21.22 21.22 94.1
Table 18 – Language Independent Debugging & Testing Responsibility Matrix

Electronic Inventory & Tracking Responsibility Matrix
Function (staff-months)      Requirements   Design   Coding & Unit Testing   Integration Testing   System Testing   Total
Project Manager 6.1 6.1 6.1 6.1 6.1 30.5
Assistant Project Manager 6.1 6.1 6.1 6.1 6.1 30.5
Chief Programmer 7.5 7.5 8.5 8.4 8.4 40.3
Secretary 6.1 6.1 6.1 6.1 6.1 30.5
Systems Engineer/Analyst 7.8 8.9 6.7 6.7 6.7 36.8
Requirements Analyst 7.8 7.8 6.7 6.7 6.7 35.7
Technical Team Leader 7.5 7.5 8.5 8.4 8.4 40.3
Programmer 7.5 7.5 8.5 8.4 8.4 40.3
Tester 2.3 4.5 7.6 8.9 8.9 32.2
Help Desk Technician 0 0 2.3 4.5 4.5 11.3
RFID Integration Manager 7.5 7.5 8.5 7.5 7.5 38.5
RFID Engineers 7.5 7.5 8.5 7.5 7.5 38.5
Training Specialist 0 0 4.3 4.5 4.5 13.3
Documentation Specialist 4 4 4.5 4.5 4.5 21.5
HR Specialist 3.4 3.4 3.4 3.4 3.4 17
Procurement Specialist 3.4 4.5 5.5 4.5 4.5 22.4
CM Specialist 3.4 3.4 3.4 3.4 3.4 17
QA Specialist 3.4 3.4 3.4 3.4 3.4 17
Total Staff Months 91.3 95.7 108.6 109 109 514
Table 19 – Electronic Inventory & Tracking Responsibility Matrix

3.0 Managerial Process

3.1 Management Objectives and Priorities

In keeping with senior management's directives to pursue opportunities within the Department of Defense (DOD), the AWATS team is committed to providing the USMC with a ground-breaking solution that will both augment and simplify current processes, all while staying within budgetary constraints. AWATS is seen as the perfect opportunity to enhance the XYZ System and provide ABC with a chance to shift its primary focus to contracts within the DOD. By establishing new and more efficient internal practices with AWATS, ABC further hopes to reduce future costs. In order to cut margins, senior management has moved toward intensifying collaborative development, software/hardware integration, and the construction of intelligent and extensible information systems. Scheduling (time) is the highest priority for AWATS, followed by technical specifications (features), and finally by financial considerations (budget).

3.1.1 Goals and Objectives

The AWATS project has been granted a charter and has been approved by the corporate portfolio committee as a project that synchronizes perfectly with senior management's goal of providing unique solutions to the DOD. The AWATS project's specific goals fall into three groups: Strategic Corporate goals, IT and Related goals, and Project Specific Tactical goals.

3.1.1.1 Strategic Corporate Goals

General objectives defined by senior management:

 Lower the duration and cost of engineering software in order to be more competitive
 Locate new DOD business niches and develop product strategies for penetrating them
 Increase internal mobility and flexibility with respect to infrastructure to better assimilate new and pertinent technologies
 Create products with more innovation, efficiency, and speed than our competitors
 Increase the emphasis on selling services along with our products
 Move away from paper trails and increase the efficiency of documentation and/or business decisions within ABC

3.1.1.2 Information Technology (IT) and Related Goals

ABC will achieve the following internal information technology goals and institute them
throughout the company:

 Work toward a Level 3 CMMI development process capability while diligently observing the current ABC CMMI Level 2
 Develop software in loosely coupled modules that are capable of being reused in future projects
 Increase customer responsiveness in order to reduce operating costs and thereby increase competitiveness
 Develop a strong hardware/software integration methodology in order to become an agile provider of hardware-based (software-driven) solutions
 Take full advantage of the Internet by developing solutions which can be accessed across various viable platforms

3.1.1.3 Project Specific Tactical Goals

The following goals are AWATS-specific, and will be achieved by augmenting the
current XYZ System’s capabilities:

 AWATS will become the de facto standard across the USMC for weapon inventory and accountability
 AWATS will serve to encourage the proposal of new and innovative solutions from within ABC
 Once in place, AWATS will demonstrate ABC's potential to rapidly assist its DOD clients with technologically superior solutions and services
 AWATS will be the spearhead solution by which ABC shall penetrate the DOD sector and showcase its capabilities in order to acquire new business opportunities
 The AWATS project will introduce and encourage several new internal business and operational practices designed to streamline the project process, make more efficient and responsible use of ABC funds, and reduce time-to-market estimates

3.1.2 Management Priorities

Management would like an end-product with no peers, created with innovative methods
and exceptional internal process controls, which will augment the existing XYZ System.
This will be accomplished with AWATS and its distributed and redundant architecture.
The USMC’s advance order of 750 client systems (and an ADR) makes this project a
corporate priority in order to satisfy the client’s needs on schedule and within budget
constraints.

3.2 Assumptions, Dependencies, and Constraints

Section 3.2 describes the assumptions, dependencies, and constraints of the AWATS
project. These factors weigh heavily on any modifications made to the application, and
should therefore be considered thoroughly before changes are made. Failure to do so may
delay or disrupt the project schedule and ABC’s ability to deliver the system.

3.2.1 Assumptions

The following assumptions have been made during the development of AWATS.

1) ABC will provide all necessary project resources including staff, software development tools, hardware, network connectivity, budget resources, training, and any necessary travel (not already covered by the contract).

2) Senior Management will be involved in each major phase of AWATS development to provide feedback and approval before a new phase is initiated.

3) The Universal 2007-A microprocessor (developed by Universal) will be delivered as scheduled in February 2007.

4) All AWATS software (written in .NET 2.0) will be compatible with the Universal 2007-A microprocessor.

5) The kernel set of software will be backwards-compatible.

6) Hazards of Electromagnetic Radiation to Ordnance (HERO) certification has already been obtained for AWATS via Marine Corps Systems Command (SYSCOM).

7) A pilot project to determine a viable means of attaching passive RFID tags to weapons has already been successfully completed.

3.2.2 Dependencies

The integrity of these dependencies is required in order for AWATS to be delivered on time.

1) The Universal 2007-A microprocessor will be delivered and available for development
and integration testing. The processor will be available for use at the alpha and beta test
sites.

2) The kernel set of software developed by our System Software Department will be
ready for integration testing no later than February 2007.

3) Ivan Industries will deliver the Graphics Presentation software package as scheduled and be ready for system integration and user acceptance testing.

4) SYSCOM must appropriate funds to procure the necessary AWATS RFID hardware
no later than six months before the initial system delivery.

3.2.3 Constraints

The following factors constrain the scope and functionality of AWATS.

1) AWATS must operate seamlessly with System XYZ software with acceptable speed
and performance, coupled with highly secured inter-system communications and a
reliable/redundant data storage entity (the ADR).

2) AWATS must follow all mandated USMC guidelines for hardware emanating
electronic signals.

3) AWATS must have sufficient bandwidth and processing power (on both the client and
server sides) to meet a level of usability specified by SYSCOM.

4) AWATS must ensure that transactions are reliably (and securely) transmitted and
received.

5) The ADR must ensure that the central data repositories are redundantly maintained
and that incoming client messages are concurrently processed in a load-balanced manner.

3.3 Risk Management

This section describes the major risks associated with the AWATS project. Risks and their corresponding mitigation strategies are described below.

The major risk of this project is the possibility of Ivan Industries not completing its development work on the graphics package on schedule. The graphics package interfaces with the word processing package for report generation. If this package is behind schedule, it will directly affect the entire AWATS project schedule; we cannot release the project until the reporting capabilities are completed. As outlined in Section 3.4.3.2, Ivan Industries team leads will submit weekly status reports (WSRs) to AWATS project management outlining their progress on the deliverable. Based on this assessment, AWATS project management can choose to meet with the Ivan Industries team leadership to discuss a strategy for getting back on schedule.

R001 – Ivan Industries not completing development work on schedule
  Mitigation: Ivan Industries is required to submit weekly status reports (WSRs) outlining its progress. AWATS project management will meet with the Ivan Industries team lead to discuss strategy if development work falls behind schedule.

R002 – Kernel is not completed on schedule
  Mitigation: Ask the kernel development team to use an incremental SDLC.

R003 – System performance is not good enough to sustain peak-load levels
  Mitigation: Create integrated test environments to simulate the projected load of the system. Run load-testing software (e.g. IBM Rational Tester) to determine the bounds of the system.

R004 – System failure
  Mitigation: Implement redundancy at all levels of the system: software, hardware, and network. Create a Disaster Recovery (DR) plan with a DR site.

R005 – Employee turnover
  Mitigation: Keep accurate and up-to-date documentation of all development. Adhere to the Tipping Point Principle: when an employee leaves, project management will alert all other employees and allow questions to be asked.

R006 – Customer dissatisfaction
  Mitigation: Get sign-offs on product reviews from the customer.

R007 – RFID hardware failing due to manufacturer standards and range options
  Mitigation: Create a list of manufacturers that use the same standards and range settings. Ensure RFID tags/readers are purchased only from manufacturers on the cross-compatibility list.

Table 20 – Risk Mitigation Strategy

3.4 Monitoring and Controlling Mechanisms
3.4.1 Schedule and Budget

3.4.1.1 Schedule

The milestones and their corresponding reviews have been set out in the Process Model Milestone Chart in Section 2.1.1. This, combined with the Project Schedule in Section 5.5, will act as the baseline against which the schedule will be monitored and controlled.

Each milestone review must occur within a window of 5% of the previous phase's duration, before or after its appointed date in the Project Schedule. If this is not achieved, the Program Manager for Systems Software Development will be notified, and a meeting will be conducted with him, the AWATS project manager, and a client representative. An action plan with a new date will come out of this meeting and will be signed by all parties. If this date is missed, another meeting will be held with the AWATS project manager, a client representative, and the Vice-President of the Software Division. Another action plan with a new date will be created and signed by all attendees. This action plan will also specify a percentage of the contract to be refunded to the customer for each day the project overruns the new date.
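To make the review-window rule concrete, the following C# sketch computes the allowed window and flags a review falling outside it. It is illustrative only; the dates and the phase duration are assumed values, not project data.

    // A minimal sketch of the 5% milestone-review window described above.
    // All dates and the phase duration are hypothetical.
    using System;

    class ScheduleMonitor
    {
        static void Main()
        {
            DateTime plannedReview = new DateTime(2007, 2, 15); // appointed review date
            TimeSpan previousPhase = TimeSpan.FromDays(60);     // previous phase's duration

            // The review must occur within +/- 5% of the previous phase's duration.
            TimeSpan window = TimeSpan.FromDays(previousPhase.TotalDays * 0.05);
            DateTime earliest = plannedReview - window;
            DateTime latest = plannedReview + window;

            DateTime actualReview = new DateTime(2007, 2, 20);
            bool escalate = actualReview < earliest || actualReview > latest;
            Console.WriteLine(escalate
                ? "Escalate: notify the Program Manager for Systems Software Development."
                : "Review held within the allowed window.");
        }
    }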

3.4.1.2 Budget

A total budgeted cost has been provided and broken down into work tasks, hardware, software, staff, and documentation. The total budgeted cost has also been broken down by milestones. At each milestone review, the total amount spent will be divided by the total budgeted cost, and if this ratio exceeds a certain threshold (e.g. 1.5), senior management will be alerted. A planning session will then be held with senior management and the AWATS PM to address the cost overrun and create an action plan to correct it going forward. It should be noted that the threshold decreases as the project goes on, to ensure a tighter envelope of success when approaching the end of the project.
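A minimal sketch of this cost-ratio check follows, assuming the decreasing per-milestone thresholds listed in the Cost row of Table 23; the cost figures themselves are hypothetical.

    // Minimal sketch of the milestone cost-ratio check described above.
    // Cost figures are hypothetical; thresholds follow the Cost row of Table 23.
    using System;

    class BudgetMonitor
    {
        static readonly double[] Thresholds = { 1.5, 1.3, 1.2, 1.2, 1.1, 1.1, 1.05 };

        static void Main()
        {
            int milestone = 2;                  // third milestone review (0-based index)
            double totalSpent = 2750000.00;     // hypothetical actual cost to date
            double totalBudgeted = 2200000.00;  // budgeted cost at this milestone

            double ratio = totalSpent / totalBudgeted;  // 1.25
            if (ratio > Thresholds[milestone])
                Console.WriteLine("Ratio {0:F2} exceeds threshold {1:F2}: alert senior management.",
                    ratio, Thresholds[milestone]);
            else
                Console.WriteLine("Ratio {0:F2} is within threshold {1:F2}.",
                    ratio, Thresholds[milestone]);
        }
    }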

3.4.2 Quality Assurance

The Quality Assurance organization will provide the AWATS project staff and management with objective insight into processes and their associated work products and services, supporting the delivery of high-quality products and services. In the end, QA will help establish and improve the overall quality of our organization's processes, work products, and services, leaving us better positioned to serve our clients by building better and more reliable products in the future.

For the AWATS project, the Quality Assurance function has been involved since the beginning of the pilot system. Due to this early involvement, QA was able to assist the System Requirements Analysis team in gathering the AWATS requirements accurately and in strictly following the GSA's contractual submission procedures and guidelines.

The QA staff will continue to have an active role as one of the support functions throughout the entire Software Development Lifecycle, ensuring both that the AWATS product is built right, according to requirements, standards, and processes, and that it is the right system for the USMC. Even though the QA team interacts daily with the AWATS project team, its staff will report directly to the Senior Management of the AWATS project. The QA members will objectively monitor and control the processes implemented in the development lifecycle.

Besides reviewing all user documentation, reports, SDFs, and test results, the AWATS QA team also has the responsibility to sign off on these documents before handing them over to the program manager for approval.

3.4.3 Productivity

3.4.3.1 Productivity Assessment

Productivity Assessment for the AWATS system will be measured by monthly status
reviews (which may be held outside of scheduled times according to the needs of the
project), verification and validation techniques, documentation progress, unit testing
(software), live product testing (hardware), and evaluation.

3.4.3.2 Productivity Metrics

Productivity metrics for AWATS shall be measured on the component level. Productivity
metrics for documentation shall be measured in pages completed by an organizational
unit per staff month (where organizational units could be individual developers or even
the project as a whole). Since COTS and outsourced packages need to interoperate
seamlessly, metrics for documenting these interfaces and their integration will be the
responsibility of the programming teams. Quarterly reviews will focus on these metrics
and their metric methodologies. Documentation will be shared between the programming
teams and the COTS vendors. The outsourcing company shall follow an emergent metric once it has been discussed, agreed upon, and shown to align with project goals. This metric may be replaced if another proves more useful at a later date.

Source Lines of Code (SLOC) produced per organizational unit in a staff month shall be the basis for deriving productivity metrics. Interfaces with software from other units will also need a productivity metric (aligned with the interface documentation metric). As suggested in SEI-CM-12-1.1, Software Metrics (ftp://ftp.sei.cmu.edu/pub/education/cm12.pdf), the number of design changes, the number of code changes, and the number of errors detected by code inspections will be counted and evaluated.
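As a worked illustration of these counts (all input figures below are hypothetical, not AWATS data), the derived productivity and defect-density numbers might be computed as follows:

    // Minimal sketch of deriving SLOC-based productivity and an inspection
    // defect density for one organizational unit. All inputs are hypothetical.
    using System;

    class ProductivityMetrics
    {
        static void Main()
        {
            int slocProduced = 12400;     // SLOC produced by the unit this period
            double staffMonths = 8.5;     // effort expended by the unit
            int designChanges = 14;       // counts suggested by SEI-CM-12-1.1
            int codeChanges = 37;
            int inspectionDefects = 22;

            double slocPerStaffMonth = slocProduced / staffMonths;
            double defectsPerKsloc = inspectionDefects / (slocProduced / 1000.0);

            Console.WriteLine("Productivity: {0:F0} SLOC per staff-month", slocPerStaffMonth);
            Console.WriteLine("Inspection defect density: {0:F1} defects per KSLOC", defectsPerKsloc);
            Console.WriteLine("Design changes: {0}; code changes: {1}", designChanges, codeChanges);
        }
    }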

Productivity metrics for testing will be based on the amount of the test completed (where the test is defined in the individual deliverable's test plan). Emphasis must be placed on the testing of interfaces between software units. The number of errors detected in program tests will be counted, evaluated, and compared to the numbers collected for design changes, code changes, and code inspections.

SEI-CM-12-1.1 proposes the following measures for counting the defects in a program:
•  The number of design changes
•  The number of errors detected by code inspections
•  The number of errors detected in program tests
•  The number of package code changes required (this measure was added for the AWATS metrics)

3.4.3.3 Methods of Collection

The work breakdown structure will be used to establish tasks on a package basis, and these tasks will be broken down into component subtasks for teams to work on. Data will be collected on the level of effort and time expended per team and module. For the outsourced components, earned value metrics will be compared against the work breakdown structure as well as the outsourcing company's electronic billing system. The work of each software development team will be evaluated at the monthly status meeting in order to gauge areas that may need additional resources as well as those that are ahead of schedule. Product teams are responsible for the data collected while executing the work breakdown structure. The product manager will keep a master copy using references to indicate both the teams and the locations of their progress data, which may be used to evaluate productivity at any time.
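For the outsourced components, the earned value comparison amounts to the standard EV formulas; the sketch below uses a single hypothetical work package (the names and values are assumptions, not project data):

    // Minimal earned-value sketch for one WBS work package.
    using System;

    class EarnedValue
    {
        static void Main()
        {
            double budgetAtCompletion = 500000.0;  // budgeted cost of the work package
            double percentComplete = 0.40;         // from team progress data
            double plannedValue = 250000.0;        // work scheduled to date, from the WBS
            double actualCost = 230000.0;          // from the electronic billing system

            double earnedValue = budgetAtCompletion * percentComplete;  // 200,000
            double scheduleVariance = earnedValue - plannedValue;       // negative: behind schedule
            double costVariance = earnedValue - actualCost;             // negative: over cost

            Console.WriteLine("EV = {0:N0}, SV = {1:N0}, CV = {2:N0}",
                earnedValue, scheduleVariance, costVariance);
        }
    }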

3.4.3.4 Organization of Reviews

Organization of reviews for the AWATS project will be conducted as follows:

•  Team members will submit a Weekly Status Report (WSR) each Friday to their team lead that outlines the work completed as well as the work to be completed in the following week.
•  The outsourced team leads will submit a WSR each Friday to the Technical Team Lead that outlines the work completed as well as the work to be completed in the following week.
•  Team Leaders will meet with the Project Manager to discuss progress on their team's development (including requirements, coding, interface development, and test plan). This will include weekly conference calls.
•  The Project Manager will meet with upper management to discuss progress on the AWATS system.
•  The Project Manager will meet monthly via teleconference with the client to discuss progress against the milestones.

At any stage, a member of the project management team (Project Manager, Assistant Project Manager, team leaders, and RFID Integration Manager) can initiate a periodic Productivity Review to measure the progress against the project's Integrated Master Schedule. If a Productivity Review reveals efforts are behind the estimated schedule or over the estimated budget, an audit will be initiated. These audits will measure the progress of each development team against the estimated schedule. In addition to periodic Productivity Reviews there will be a more intensive review at the milestone points defined for each component, as described below:

Component Name                                      Requirements  Preliminary    Critical       Test Readiness  Deployment
                                                    Review        Design Review  Design Review  Review          Review
General Purpose Database Package                    10/25/2006    11/04/2006     11/29/2006     12/19/2006      01/20/2007
Spreadsheet Package                                 10/09/2006    10/29/2006     11/28/2006     12/28/2006      01/28/2007
Configuration & Requirements Management Package     10/02/2006    10/20/2006     11/11/2006     12/01/2006      01/02/2007
Communication Package                               05/01/2007    06/15/2007     07/09/2007     08/01/2007      08/27/2007
Graphics Presentation Package                       10/10/2006    11/29/2006     01/15/2007     02/15/2007      03/11/2007
Word Processing Package                             02/15/2007    03/01/2007     04/18/2007     05/10/2007      06/01/2007
Project Manager's Package                           09/07/2006    09/15/2006     10/01/2006     10/25/2006      11/15/2006
GPS Navigation Package                              12/28/2006    01/15/2007     02/15/2007     03/01/2007      03/28/2007
Compile/Link/Runtime Packages for MS Visual Studio  09/13/2006    09/25/2006     10/02/2006     10/20/2006      11/05/2006
Language Independent Debugging & Testing Package    09/01/2006    09/19/2006     10/26/2006     12/01/2006      01/04/2007
Electronic Inventory & Tracking Package             10/04/2006    11/30/2006     01/11/2007     04/29/2007      06/13/2007

Table 21 – Productivity Review Schedule

Please note that for all packages, 5 days are allotted for each review. Additionally, there are no milestone reviews designated between coding and integration/unit testing.

The following table, from http://www.processimpact.com/articles/metrics_primer.html, suggests appropriate metrics for each group:

Group: Individual Developers
•  Work effort distribution
•  Estimated vs. actual task duration and effort
•  Code covered by unit testing
•  Number of defects found by unit testing
•  Code and design complexity

Group: Project Teams
•  Product size
•  Work effort distribution
•  Requirements status (number approved, implemented, and verified)
•  Percentage of test cases passed
•  Estimated vs. actual duration between major milestones
•  Estimated vs. actual staffing levels
•  Number of defects found by integration and system testing
•  Number of defects found by inspections
•  Defect status
•  Requirements stability
•  Number of tasks planned and completed

Group: Development Organization
•  Released defect levels
•  Product development cycle time
•  Schedule and effort estimating accuracy
•  Reuse effectiveness
•  Planned and actual cost

Table 22 – Group's Appropriate Metrics

Lifecycle phases: Requirements (Req't), Design, Code & Unit Test (CUT), Integration Test (IT), System Test (ST), Post Deployment Software Support (PDSS), and User Operation.

Schedule
  Attribute: Position in Gantt Chart (all phases)
  Derived from: Team Reporting on Work Items Completed (all phases)
  Decision Indicator: 5% Behind Schedule (Req't, Design, IT, ST, PDSS); 10% Behind Schedule (CUT, User Operation)

Cost
  Attribute: Total Actual Cost vs. Budgeted Cost (all phases)
  Derived from: Timesheets, Purchase Orders (all phases)
  Decision Indicator: Over 1.5 (Req't); Over 1.3 (Design); Over 1.2 (CUT, IT); Over 1.1 (ST, PDSS); Over 1.05 (User Operation)

Quality
  Attribute: New vs. Completed Requirements (Req't); New vs. Completed Design Models (Design); Discovered vs. Closed Defects (CUT, IT, ST); Avg. Time to Install System Successfully (PDSS); Survey of Customers Rating Quality, 1-5 Stars (User Operation)
  Derived from: Requirements Specification; Software Design Document; Bug Tracking System (CUT, IT, ST); Deployment Team (PDSS); Monthly Survey of a Sample of Customers (User Operation)
  Decision Indicator: Over 1.5 (Req't, Design, CUT); Over .5 (IT); Over .1 (ST); Over 3 Days (PDSS); Below 4 (User Operation)

Productivity
  Attribute: New Requirements Completed per Week (Req't); Design Models Completed per Week (Design); Source Lines of Code per Week per Programmer (CUT); Successful Builds per Day (IT); Defects Found per Day (ST); # Sites Installed per Week (PDSS); # Trouble Tickets Handled per Week (User Operation)
  Derived from: Requirements Specification; Software Design Document; Version Tracking System; QA Team; Testing Team; Deployment Team; Help Desk Manager
  Decision Indicator: Below 20 (Req't); Below 10 (Design); Below 200 (CUT); Below 1 (IT); Below 10 (ST); Below 2 (PDSS); Below 100 (User Operation)

Progress
  Attribute: New vs. Completed Requirements (Req't); New vs. Completed Design Models (Design); Discovered vs. Closed Defects (CUT, IT, ST); # Sites Installed vs. Total Sites (PDSS); Survey of Customers Rating Quality, 1-5 Stars (User Operation)
  Derived from: Requirements Specification; Software Design Document; Bug Tracking System (CUT, IT, ST); Deployment Team (PDSS); Monthly Survey of a Sample of Customers (User Operation)
  Decision Indicator: Over 1.5 (Req't, Design, CUT); Over .5 (IT); Over .1 (ST); Below .2 (PDSS); Below 4 (User Operation)

Table 23 – Class vs. Phase Measurements

Attribute collected
  Cost: Direct/Indirect Costs charged to the project; Overhead Cost; Total Project Cost
  Schedule: Work Packages Completed
  Quality: Functional Correctness; End-User Satisfaction; Supportability/Maintainability
  Productivity: Technological Effectiveness; Rework Efforts; Amount of Reuse; SLOC for Effort
  Progress: 1. System: Performance Requirements; 2. Project: Components Progression

Source of data
  Cost: Development Team
  Schedule: Development Team
  Quality: Developers, Testers, End-Users
  Productivity: Developers, System Administrators, End-Users
  Progress: End-Users, Help Desk

Collection schedule
  Cost: Bi-Weekly
  Schedule: Bi-Weekly
  Quality: Bi-Weekly
  Productivity: Daily
  Progress: 1. System: Bi-Weekly; 2. Project: Weekly

How data is collected (e.g., tool)
  Cost: Timesheets; Overhead Rates; Actual Positions; Actual ODC
  Schedule: Individual Team Progress Report
  Quality: Surveys; Testing; Trouble Reports; Peer Reviews; Time to Restore; Maintenance Actions
  Productivity: Surveys; Quality Audits; Customer Complaints; Form Performance Ratings; Support Time; Requirements Coverage
  Progress: 1. System: Utilization, Throughput, Timing, Failures, Fault Tolerance; 2. Project: Components Status, Change Request Status, Action Item Status

Where the attribute is stored (database)
  Cost: AWATS Project Budget; Cost Analysis Reports; Funds Spent; Remaining Funds
  Schedule: MS Project; OpenOffice
  Quality: Software Development Folders (SDFs); Problem Reports; Audit Logs for Systems
  Productivity: Software Development Folder
  Progress: Discrepancy Report; Remedy Software

Who verifies the attribute
  Cost: Operation Manager / Project Control
  Schedule: Project Manager
  Quality: Software Quality Assurance; Test Team; QA Test Engineer
  Productivity: Test Manager; Technical Support Supervisor
  Progress: Operation Test Engineer; Technical Support Supervisor

Table 24 – Project Measures Matrix

3.4.4 Progress

3.4.4.1 Progress Assessment


The AWATS project tracks task progress at all levels: project, team, and individual. Each week, each team holds a meeting, chaired by its team leader, to review all major tasks against existing schedules, including individual tasks and each team member's schedule milestones. All team leaders then attend a weekly CCB meeting, chaired by the Project Manager, to assess the project, including the progress of upcoming releases such as the Alpha Testing phase. In addition, the AWATS Project Manager and all team leaders will hold an Internal Project Review (IPR) with the Program/Business Manager each quarter. The IPRs assess the strategic direction of the project and address customer satisfaction and concerns.

3.4.4.2 Progress Metrics


Progress metrics on the AWATS project will be generated and tracked in the project schedules maintained in Microsoft Project 2003, one of the COTS packages employed on the project. The "Software Metrics Guide" published by the University of Southern California34 will be used to provide guidance in gathering information for the progress metrics.

Indicator Category: Progress
Management Insight: Provides information on how well the project is performing with respect to its schedule.
Indicators: 1. Actual versus planned task completions. 2. Actual versus planned durations.

Table 25 – Metrics Set for AWATS Project

The AWATS project schedules include "Actual Duration" and "Actual Finish" columns that show actual versus planned task completion as well as actual versus planned duration for each task. The following MS Project 2003 figure provides an example of how the AWATS project will track the progress of each task.

34 "Software Metrics Guide" - http://sunset.usc.edu/classes/cs577b_2001/metricsguide/metrics.html#p31

Figure 12 – Progress Tracking

All AWATS schedules have these progress indicators to provide information on how well the project is performing with respect to planned task completions and schedule commitments. Tasks are scheduled, and progress is then tracked against the schedules. Metrics are collected for the activities and milestones identified in the project schedules, as shown in Task 79 in the figure above. Actual completions are compared to planned completions to determine whether there are deviations from the plan.

The USC guide recommends that "each project identifies tasks for which progress metrics will be collected. The completion criteria for each task must be well defined and measurable. The project should establish range limits (thresholds) on the planned task progress for the project. The thresholds are used for management of software development risk."

The following figure, borrowed from the USC guide, "depicts the cumulative number of planned and actual completions (or milestones) over time. Note that this chart is generic, and each project will substitute specific tasks (units, milestones, SLOCs, etc.). Additionally, each project is expected to produce multiple progress charts for different types of tasks, different teams, etc."

Figure 13 - Progress Indicator Example

The AWATS project will export the cumulative task-completion percentages from the MS Project schedule into this MS Excel chart.
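A minimal sketch of this planned-versus-actual comparison follows; the completion counts and the 10% flag threshold are illustrative assumptions, not project values.

    // Minimal sketch of the progress indicator: cumulative actual completions
    // compared against cumulative planned completions. Inputs are hypothetical.
    using System;

    class ProgressIndicator
    {
        static void Main()
        {
            int plannedCompletions = 40;  // cumulative tasks planned to date
            int actualCompletions = 34;   // cumulative tasks actually completed

            double deviation = (plannedCompletions - actualCompletions)
                               / (double)plannedCompletions;

            if (deviation > 0.10)  // assumed threshold; thresholds manage schedule risk
                Console.WriteLine("Deviation {0:P0} from plan exceeds the threshold: flag for review.",
                    deviation);
            else
                Console.WriteLine("Progress within tolerance ({0:P0} behind plan).", deviation);
        }
    }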

3.5 Staffing Plan


3.5.1 Obtaining Personnel

When filling a position for the AWATS project, the following methods will be used. The
methods are listed in order of preference.

•  Consider future employees based on employee referrals.
•  Consider future employees based on internal transfers.
•  Consider future employees based on previous work experience.
•  Utilize the service of a recruiter to aid in the employee search.

3.5.2 Training

Once employees are acquired for the project, they must be adequately trained. The following are training incentives for employees.
•  Provide career training to all employees. Training will be reimbursed up to $2,500 per year and includes courses taught by company staff and external courses taken at an approved site.
•  Provide tuition reimbursement for all employees pursuing career-relevant higher education degrees. Tuition will be reimbursed up to $5,000 per year.
•  Provide Mentoring Circles and other professional forums for all employees.

3.5.3 Retaining

Once employees are hired into the company, efforts will be made to keep them challenged, and compensated accordingly. The methods are listed below.
•  Provide recognition programs, which include cash awards or credits toward the company store.
•  Provide annual 360-degree assessments and annual percentage raises.
•  Compensate employees for overtime work at time and a half.
•  Provide project team-building activities such as quarterly out-of-office events.

3.5.4 Phasing out of Personnel

•  Provide a phasing-out plan for employees. The plan includes a timeline for phasing out employees close to the end of the project and avenues for transferring employees to other projects within the company.
•  Advertise in advance the length of the project and the time required of each employee.

Project Title | Degree/Major | Experience (Years) | Level | Skill Level | Salary (Annual) | Bid Price | Labor Category | Source
Project Manager | MS Information Systems, BS MIS | 10 | Senior | 4 | $110,000 | $330,000 | 45 | Transfer
Assistant Project Manager | MS Management in Information Systems, BS Information Systems | 7 | Mid Level | 3 | $98,000 | $294,000 | 40 | Transfer
Chief Programmer | BS Computer Science | 5 | Senior | 4 | $90,000 | $270,000 | 40 | Transfer
Secretary | Vocational Training | 4 | Mid Level | 3 | $45,000 | $135,000 | 41 | Transfer
Systems Engineer/Analyst | MS Systems Engineering, BS Computer Science | 5 | Mid Level | 4 | $75,000 | $225,000 | 40 | Advertise
Requirements Analyst | BS Information Technology | 4 | Mid Level | 3 | $60,000 | $180,000 | 39 | Transfer
Technical Team Leader | BS Computer Science | 4 | Mid Level | 3 | $80,000 | $240,000 | 39 | Transfer
Programmer | BS Computer Science | 2 | Junior | 2 | $55,000 | $165,000 | 38 | Transfer
Tester | BS Computer Science | 2 | Junior | 1 | $60,000 | $180,000 | 37 | Transfer
Help Desk Technician | BS Information Systems | 3 | Junior | 2 | $55,000 | $165,000 | 40 | Transfer
RFID Integration Manager | MS Systems Engineering, BS Computer Science | 7 | Mid Level | 4 | $85,000 | $255,000 | 44 | Advertise
RFID Engineers | BS Systems Engineering | 4 | Mid Level | 3 | $70,000 | $210,000 | 43 | Advertise
Training Specialist | BS Information Technology | 4 | Mid Level | 3 | $58,000 | $174,000 | 43 | Transfer
Documentation Specialist | BS Technical & Scientific Communications | 3 | Mid Level | 3 | $52,000 | $156,000 | 42 | Transfer
HR Specialist | BS Communications | 5 | Mid Level | 2 | $51,000 | $153,000 | 39 | Transfer
Procurement Specialist | BS Information Systems | 4 | Mid Level | 2 | $60,000 | $180,000 | 39 | Transfer
CM Specialist | BS MIS | 4 | Mid Level | 3 | $78,000 | $234,000 | 39 | Transfer
QA Specialist | BS Computer Science | 5 | Senior | 4 | $88,000 | $264,000 | 42 | Transfer

Table 26 – Staff, Expertise, Recruitment, and Utilization Sheet

Totals                      Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Management 8 9 12 14 14 13 10 7 7 7 10 11 15 13 13 13 12
Project Control 6 11 15 17 16 15 12 8 7 8 13 15 16 15 16 15 13
Programmer 0 4 14 18 19 20 15 11 10 13 21 22 29 29 25 22 26
Analyst 21 30 35 31 28 16 10 7 9 10 10 11 7 7 8 6 4
Tester 0 0 0 10 12 12 8 8 5 7 8 12 12 12 12 13 11
Totals 35 54 76 90 89 76 55 41 38 45 62 71 79 76 74 69 66
                                   
Database Package            Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Management 1 1 2 2 2 3 4 4 4 3 3 4 4 4 2 2 2
Project Control 0 1 2 3 3 3 4 3 3 3 3 4 4 4 4 3 2
Programmer 0 0 0 0 4 4 6 5 5 5 6 6 6 5 4 4 3
Analyst 1 2 4 4 5 5 5 4 4 5 5 4 3 2 1 0 0
Tester 0 0 0 0 1 1 2 3 4 4 5 5 4 5 5 4 4
Totals 2 4 8 9 15 16 21 19 20 20 22 23 21 20 16 13 11
                                   
Spreadsheet Package         Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Management 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 2 2
Project Control 0 0 0 0 0 0 0 0 0 0 0 1 1 2 2 2 2
Programmer 0 0 0 0 0 0 0 0 0 0 0 0 3 3 5 3 3
Analyst 0 0 0 0 0 0 0 0 0 0 0 3 1 2 1 0 0
Tester 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Totals 0 0 0 0 0 0 0 0 0 0 0 4 6 8 9 7 7
                                   
Requirements Management Package    Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Management 1 2 3 4 4 3 2 0 0 0 0 0 0 0 0 0 0
Project Control 1 2 3 3 4 3 3 0 0 0 0 0 0 0 0 0 0
Programmer 0 0 4 5 6 4 2 0 0 0 0 0 0 0 0 0 0
Analyst 4 5 5 4 2 1 0 0 0 0 0 0 0 0 0 0 0
Tester 0 0 0 4 3 2 1 1 0 0 0 0 0 0 0 0 0
Totals 6 9 15 20 19 13 8 1 0 0 0 0 0 0 0 0 0
                                   
Secure Communications Package      Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Management 0 0 0 0 0 0 0 0 1 1 2 1 3 2 1 1 0

Project Control 0 0 0 0 0 0 0 0 1 1 2 2 3 2 1 1 0
Programmer 0 0 0 0 0 0 0 0 0 1 5 4 2 1 1 0 0
Analyst 0 0 0 0 0 0 0 0 3 4 3 1 0 0 0 0 0
Tester 0 0 0 0 0 0 0 0 0 0 0 2 3 3 2 2 0
Totals 0 0 0 0 0 0 0 0 5 7 12 10 11 8 5 4 0
                                   
Graphic Presentation Package       Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Management 0 0 0 0 0 0 0 0 0 0 2 2 3 1 3 2 1
Project Control 0 0 0 0 0 0 0 0 0 0 3 3 3 2 3 2 1
Programmer 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Analyst 0 0 0 0 0 0 0 0 0 0 2 3 3 3 2 0 0
Tester 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 2 1
Totals 0 0 0 0 0 0 0 0 0 0 7 8 9 7 9 6 3
                                   
Word Processing Package     Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Management 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 2 3
Project Control 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 2 3
Programmer 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 5
Analyst 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4 6 4
Tester 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Totals 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 10 15
                                   
Project Manager Package     Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Management 1 1 2 2 3 2 1 0 0 0 0 0 0 0 0 0 0
Project Control 1 1 2 3 3 3 1 1 0 0 0 0 0 0 0 0 0
Programmer 0 2 5 6 3 4 0 0 0 0 0 0 0 0 0 0 0
Analyst 2 5 4 4 3 0 0 0 0 0 0 0 0 0 0 0 0
Tester 0 0 0 0 3 3 2 1 0 0 0 0 0 0 0 0 0
Totals 4 9 13 15 15 12 4 2 0 0 0 0 0 0 0 0 0
                                   
GPS Navigation Package      Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Management 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Project Control 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Programmer 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Analyst 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Tester 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Totals 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0

                                   
Compile/Link/Runtime Package       Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Management 1 1 2 2 2 2 1 1 0 0 0 0 0 0 0 0 0
Project Control 0 1 2 2 2 2 1 1 0 0 0 0 0 0 0 0 0
Programmer 0 0 3 5 4 6 4 3 0 0 0 0 0 0 0 0 0
Analyst 3 3 4 5 3 0 0 0 0 0 0 0 0 0 0 0 0
Tester 0 0 0 3 3 5 3 3 0 0 0 0 0 0 0 0 0
Totals 4 5 11 17 14 15 9 8 0 0 0 0 0 0 0 0 0
                                   
Debugging/Testing Package   Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Management 1 1 1 2 1 0 0 0 0 0 0 0 0 0 0 0 0
Project Control 1 2 2 2 1 1 0 0 0 0 0 0 0 0 0 0 0
Programmer 0 2 2 2 0 0 0 0 0 0 0 0 0 0 0 0 0
Analyst 3 3 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Tester 0 0 0 3 2 1 0 0 0 0 0 0 0 0 0 0 0
Totals 5 8 7 9 4 2 0 0 0 0 0 0 0 0 0 0 0
                                   
Electronic Inventory Package       Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Management 3 3 2 2 2 3 2 2 2 3 3 4 4 5 5 4 4
Project Control 3 4 4 4 3 3 3 3 3 4 5 5 5 5 5 5 5
Programmer 0 0 0 0 2 2 3 3 5 7 10 12 18 20 15 15 15
Analyst 8 12 16 14 15 10 5 3 2 1 0 0 0 0 0 0 0
Tester 0 0 0 0 0 0 0 0 1 3 3 5 5 3 4 5 6
Totals 14 19 22 20 22 18 13 11 13 18 21 26 32 33 29 29 30
                                   
                                   
                                   
                                   
Project Management:         Sep-06 Oct-06 Nov-06 Dec-06 Jan-07 Feb-07 Mar-07 Apr-07 May-07 Jun-07 Jul-07 Aug-07 Sep-07 Oct-07 Nov-07 Dec-07 Jan-08
Project Manager 4 4 4 5 5 6 5 4 4 5 5 5 5 6 7 5 5
Assistant Project Manager   4 4 4 6 4 6 5 4 4 3 3 6 4 6 6 5 5
Secretary 3 3 3 4 4 4 4 3 3 4 4 4 4 3 3 4 4
HR Specialist 2 2 2 2 4 4 5 2 2 4 4 2 5 6 5 4 4
RFID Integration Manager    1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
Technical Team Leader 9 10 9 9 11 9 6 10 10 8 8 9 9 10 9 9 9
Totals 23 24 23 27 29 30 26 24 24 25 25 27 28 32 31 28 28

Table 27 – AWATS Staffing (September 2006 – January 2008)

Totals                      Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Management 9 6 6 3 4 4 2 1 0 0
Project Control 13 11 9 6 5 5 4 3 0 0
Programmer 29 24 18 11 8 4 3 0 0 0
Analyst 3 5 5 4 4 3 2 0 0 0
Tester 12 11 6 5 4 4 3 2 0 0
Totals 66 57 44 29 25 20 14 6 0 0
                     
Database Package            Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Management 1 0 0 0 0 0 0 0 0 0
Project Control 1 0 0 0 0 0 0 0 0 0
Programmer 2 0 0 0 0 0 0 0 0 0
Analyst 0 0 0 0 0 0 0 0 0 0
Tester 3 3 0 0 0 0 0 0 0 0
Totals 7 3 0 0 0 0 0 0 0 0
                     
Spreadsheet Package         Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Management 1 1 1 0 0 0 0 0 0 0
Project Control 2 2 1 1 0 0 0 0 0 0
Programmer 2 1 0 0 0 0 0 0 0 0
Analyst 0 0 0 0 0 0 0 0 0 0
Tester 1 2 1 1 1 0 0 0 0 0
Totals 6 6 3 2 1 0 0 0 0 0
                     
Requirements Management Package    Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Management 0 0 0 0 0 0 0 0 0 0
Project Control 0 0 0 0 0 0 0 0 0 0
Programmer 0 0 0 0 0 0 0 0 0 0
Analyst 0 0 0 0 0 0 0 0 0 0
Tester 0 0 0 0 0 0 0 0 0 0
Totals 0 0 0 0 0 0 0 0 0 0
                     

Secure Communications Package      Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Management 0 0 0 0 0 0 0 0 0 0
Project Control 0 0 0 0 0 0 0 0 0 0
Programmer 0 0 0 0 0 0 0 0 0 0
Analyst 0 0 0 0 0 0 0 0 0 0
Tester 0 0 0 0 0 0 0 0 0 0
Totals 0 0 0 0 0 0 0 0 0 0
                     
Graphic Presentation Package       Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Management 1 1 0 0 0 0 0 0 0 0
Project Control 2 1 0 0 0 0 0 0 0 0
Programmer 0 0 0 0 0 0 0 0 0 0
Analyst 0 0 0 0 0 0 0 0 0 0
Tester 1 1 0 0 0 0 0 0 0 0
Totals 4 3 0 0 0 0 0 0 0 0
                     
Word Processing Package     Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Management 2 1 1 0 0 0 0 0 0 0
Project Control 3 3 2 1 0 0 0 0 0 0
Programmer 5 4 3 0 0 0 0 0 0 0
Analyst 3 2 1 0 0 0 0 0 0 0
Tester 4 3 3 2 0 0 0 0 0 0
Totals 17 13 10 3 0 0 0 0 0 0
                     
Project Manager Package     Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Management 0 0 0 0 0 0 0 0 0 0
Project Control 0 0 0 0 0 0 0 0 0 0
Programmer 0 0 0 0 0 0 0 0 0 0
Analyst 0 0 0 0 0 0 0 0 0 0
Tester 0 0 0 0 0 0 0 0 0 0
Totals 0 0 0 0 0 0 0 0 0 0
                     
GPS Navigation Package      Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Management 0 0 1 1 2 2 0 0 0 0
Project Control 0 0 1 1 2 2 1 1 0 0

Programmer 0 0 0 3 4 2 2 0 0 0
Analyst 0 3 4 4 4 3 2 0 0 0
Tester 0 0 0 0 2 3 2 1 0 0
Totals 0 3 6 9 14 12 7 2 0 0
                     
Compile/Link/Runtime Package       Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Management 0 0 0 0 0 0 0 0 0 0
Project Control 0 0 0 0 0 0 0 0 0 0
Programmer 0 0 0 0 0 0 0 0 0 0
Analyst 0 0 0 0 0 0 0 0 0 0
Tester 0 0 0 0 0 0 0 0 0 0
Totals 0 0 0 0 0 0 0 0 0 0
                     
Debugging/Testing Package   Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Management 0 0 0 0 0 0 0 0 0 0
Project Control 0 0 0 0 0 0 0 0 0 0
Programmer 0 0 0 0 0 0 0 0 0 0
Analyst 0 0 0 0 0 0 0 0 0 0
Tester 0 0 0 0 0 0 0 0 0 0
Totals 0 0 0 0 0 0 0 0 0 0
                     
Electronic Inventory Package       Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Management 4 3 3 2 2 2 2 1 0 0
Project Control 5 5 5 3 3 3 3 2 0 0
Programmer 20 19 15 8 4 2 1 0 0 0
Analyst 0 0 0 0 0 0 0 0 0 0
Tester 3 2 2 2 1 1 1 1 0 0
Totals 32 29 25 15 10 8 7 4 0 0
                     
                     
                     
                     
Project Management:         Feb-08 Mar-08 Apr-08 May-08 Jun-08 Jul-08 Aug-08 Sep-08 Oct-08 Nov-08
Project Manager 5 4 4 5 4 4 4 5 4 4
Assistant Project Manager   6 5 5 3 5 5 4 5 4 4
Secretary 4 4 4 3 4 4 3 4 3 3
HR Specialist 2 2 2 2 2 2 3 3 2 2

RFID Integration Manager    1 1 1 1 1 1 1 1 1 1
Technical Team Leader 9 9 9 11 9 9 9 7 8 8
Totals 27 25 25 25 25 25 24 25 22 22
Table 28 – AWATS Staffing (February 2008 – November 2008)

[Chart: Database Package staffing per month, Sep-06 through Sep-08, by labor category: Project Management, Project Control, Programmer, Analyst, Tester.]

Figure 14 - Database Staffing Over Time
[Chart: Spreadsheet Package staffing per month, Sep-06 through Nov-08, by labor category: Project Management, Project Control, Programmer, Analyst, Tester.]

Figure 15 - Spreadsheet Staffing Over Time
[Chart: Requirements Management Package staffing per month, Sep-06 through Nov-08, by labor category: Project Management, Project Control, Programmer, Analyst, Tester.]

Figure 16 - Requirements Management Staffing Over Time
[Chart: Secure Communications Package staffing per month, Sep-06 through Nov-08, by labor category: Project Management, Project Control, Programmer, Analyst, Tester.]

Figure 17 - Secure Communications Staffing Over Time
[Chart: Graphic Presentation Package staffing per month, Sep-06 through Nov-08, by labor category: Project Management, Project Control, Programmer, Analyst, Tester.]

Figure 18 - Graphics Presentation Staffing Over Time
[Chart: Word Processing Package staffing per month, Sep-06 through Nov-08, by labor category: Project Management, Project Control, Programmer, Analyst, Tester.]

Figure 19 - Word Processing Staffing Over Time
[Chart: Project Manager Package staffing per month, Sep-06 through Nov-08, by labor category: Project Management, Project Control, Programmer, Analyst, Tester.]

Figure 20 - Project Management Staffing Over Time
[Chart: GPS Navigation Package staffing per month, Sep-06 through Nov-08, by labor category: Project Management, Project Control, Programmer, Analyst, Tester.]

Figure 21 - GPS Navigation Staffing Over Time
[Chart: Compile/Link/Runtime Package staffing per month, Sep-06 through Nov-08, by labor category: Project Management, Project Control, Programmer, Analyst, Tester.]

Figure 22 - Compile/Link/Runtime Staffing Over Time
[Chart: Debugging/Testing Package staffing per month, Sep-06 through Nov-08, by labor category: Project Management, Project Control, Programmer, Analyst, Tester.]

Figure 23 - Debugging/Testing Staffing Over Time
[Chart: Electronic Inventory Package staffing per month, Sep-06 through Nov-08, by labor category: Project Management, Project Control, Programmer, Analyst, Tester.]

Figure 24 - Electronic Inventory Staffing Over Time
[Chart: management staffing per month, Sep-06 through Nov-08, by labor category: Project Manager, Assistant Project Manager, Secretary, HR Specialist, RFID Integration Manager, Technical Team Leader.]

Figure 25 - Management Staff Over Time
[Chart: management staffing per month with totals, Sep-06 through Nov-08, by labor category: Project Manager, Assistant Project Manager, Secretary, HR Specialist, RFID Integration Manager, Technical Team Leader, Totals.]

Figure 26 - Overall Staff Over Time
[Chart: overall staff per month, Sep-06 through Sep-08, stacked by package: Electronic Inventory, Debugging/Testing, Compile/Link/Runtime, GPS Navigation, Project Manager, Word Processing, Graphic Presentation, Secure Communications, Requirements Management, Spreadsheet, Database.]

Figure 27 - Overall Staff by Application Over Time
4.0 Technical Process

4.1 Methods, Tools, and Techniques


The actual methods and standards will vary as deliverables proceed through each development phase of the waterfall software life cycle. The following outline serves as a governing guide to currently understood best practices and standards. It describes applicable tools, methods, and standards, with the understanding that local interpretations of and deviations from those standards may apply during each phase and for each type of software application (custom, reuse, and COTS):

1. Requirements Processes
1.1. Applicable Tools
1.1.1. Rational Rose
1.1.2. Rational Requisite Pro
1.1.3. Rational Unified Process
1.1.4. Microsoft Team Foundation Server
1.2. Methods
1.2.1. UML
1.2.2. Use Cases
1.2.3. Interactive storyboards
1.3. Standards
1.3.1. IEEE 830, Recommended Practices for Software Requirements
Specifications
1.3.2. IEEE 1062, Recommended Practices for Software Acquisition
1.3.3. IEEE 1420, Standard for Information Technology – Software Reuse

2. Design
2.1 Tools
2.1.1 Microsoft SQL Management Studio
2.1.2 Rational Rose
2.1.3 Rational Unified Process
2.1.4 CASE Tools
2.2 Methods
2.2.1 Object Oriented Design
2.2.2 ER Diagrams
2.2.3 Relational Database Design
2.2.4 UML
2.2.5 XML
2.2.6 Asynchronous Design Methodology
2.3 Standards
2.3.1 IEEE 1016, Recommended Practice for Software Design
Descriptions
2.3.2 European Computer Manufacturers Association (ECMA) TR/55,
Reference Model for Frameworks of Software Engineering Environments

2.3.3 IEEE 1348, Recommended Practices for Adoption of CASE tools
2.3.4 IEEE 1420, Guide for Information Technology – Software Reuse
Concept of Operation for Interoperating Reuse Libraries

3. Code
3.1 Tools
3.1.1 C#
3.1.2 Visual Studio 2005 Team Suite
3.1.3 Rational Clearcase
3.1.4 Rational Unified Process
3.2 Methods
3.2.1 Test Driven Development (see the sketch following this outline)
3.2.2 Pair Programming
3.3 Standards
3.3.1 Microsoft .NET Framework Design Guidelines
3.3.2 Code Conventions for the Java Programming Language (SUN
Microsystems, JAVA Code)
3.3.3 IEEE 1028, Standard for Software Reviews and Audits
3.3.4 IEEE 730, Standard for Software Quality Assurance Plans
3.3.5 IEEE 1298, Standard for Software Quality Management Systems

4. Unit Test
4.1 Tools
4.1.1 NUnit
4.1.2 Rational Robot
4.1.3 Rational Clearcase
4.1.4 Rational Unified Process
4.1.5 Rational TestManager
4.2 Methods
4.2.1 Test Driven (NAnt)
4.2.2 Testing with sample data
4.2.3 Testing with expected high volume data sets
4.3 Standards
4.3.1 IEEE 1008, Standard for Software Unit Testing
4.3.2 IEEE 1012, Standard for Software Verification and Validation Plans
4.3.3 IEEE 829, Standard for Software Test Documentation

5. Integration Test
5.1 Tools
5.1.1 Rational TestManager
5.1.2 Rational Robot
5.1.3 Rational Clearcase
5.1.4 Rational Unified Process
5.2 Methods
5.2.1 Top-down testing

5.2.2 White Box testing
5.3 Standards
5.3.1 Open Process Framework (OPF) Testing
5.3.2 IEEE 829, Standard for Software Test Documentation

6. System Test
6.1 Tools
6.1.1 Rational TestManager
6.1.2 Rational Robot
6.1.3 Rational Clearcase
6.1.4 Rational Unified Process
6.2 Methods
6.2.1 Black Box Testing
6.2.2 Thread Testing
6.2.3 White Box Testing
6.2.4 Limited-Scale Integration Testing
6.2.5 Full-Scale Integration Testing
6.2.6 Test Plan
6.2.7 Test Execution and Issue Resolution
6.2.8 Test Documentation
6.3 Standards
6.3.1 IEEE 829, Standard for Software Test Documentation

7. Deployment
7.1 Tools
7.1.1 Symantec Norton Antivirus
7.1.2 The Shield Pro
7.1.3 Bullguard Internet Security
7.1.4 EZAntivirus
7.1.5 BitDefender 9
7.2 Methods
7.2.1 Beta Testing
7.2.2 Software Integration
7.2.3 Full Deployment
7.2.4 Baselines
7.3 Standards
7.3.1 IEEE 828, Standard for Software Configuration Management Plans
7.3.2 IEEE 1063, Standard for Software User Documentation

8. Maintenance
8.1 Tools
8.1.1 Parature, Inc. Help Desk Software
8.1.2 Laplink
8.2 Methods
8.2.1 Remote monitoring
8.2.2 Help Desk

8.2.3 Service Level Agreement (SLA)
8.2.4 Automatic updates
8.2.5 Error tracking
8.2.6 Baselines
8.2.7 Software Integration
8.2.8 Security Practices
8.2.9 Information Technology Infrastructure Library (ITIL)
8.3 Standards
8.3.1 IEEE 1219, Standard for Software Maintenance
8.3.2 IEEE 1042, Standard for Software Configuration Management
8.3.3 ISO 17799, Security Standard
8.3.4 BS 7799, Guidelines for Information Security Risk Management
8.3.5 ISO 27001, Information Security Management – Specification With
Guidance for Use

9. Disposal
9.1 Tools
9.1.1 Dell Computer Recycling Service
9.2 Methods
9.2.1 Certified Data Destruction
9.2.2 Hard Drive Erasure
9.2.3 Equipment Inventory, Inspection, and Testing
9.2.4 Comprehensive Reporting
9.3 Standards
9.3.1 Applicable State and Local Regulations
9.3.2 International Association of Electronics Recyclers
9.3.3 Electronic Industries Alliance (EIA) Reuse and Recycle
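To make the test-driven method (3.2.1) and the NUnit tooling (4.1.1) concrete, here is a minimal sketch; the WeaponRecord class and its members are hypothetical illustrations, not actual AWATS code:

    // A minimal NUnit test-driven sketch. Under TDD, a failing test like this
    // is written first, and production code is extended until it passes.
    using NUnit.Framework;

    public class WeaponRecord
    {
        public string TagId;
        public bool CheckedOut;

        // Marks the weapon as issued; the behavior under test.
        public void CheckOut() { CheckedOut = true; }
    }

    [TestFixture]
    public class WeaponRecordTests
    {
        [Test]
        public void CheckOut_MarksWeaponAsCheckedOut()
        {
            WeaponRecord record = new WeaponRecord();
            record.TagId = "RFID-0001";

            record.CheckOut();

            Assert.IsTrue(record.CheckedOut);
        }
    }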

4.2 Software Documentation


The following table describes the documents that will be delivered on this project. Each document will follow a standardized format and will be approved by essential personnel. Separate parts of the software team will write the documents, and a price is assigned based on the number of pages in each document. It is assumed that the Documentation Team will take part in every document's production.

Name | Format | Written By | Pages | Time to Initiate | Who Approves | Distribution | Price
Project Proposal | Corporation Format | Marketing, AWATS PM | 20 | Before project initiates | Senior Corporate Mngmt. | Senior Corporate Mngmt., AWATS PM | $3,000
System Design Document | DI-MCCR-80534 (MIL-STD-498) | Systems Engineer, Design Teams | 150 | After SPMP | Software Quality Control | AWATS PM, Package PM's, Package Design Teams | $22,500
Software Project Management Plan (SPMP) | IEEE Std 1058.1-1987 | AWATS PM, Package PM's, Assistant PM | 300 | Before project initiates | Senior Corporate Mngmt. | Entire Software Team | $45,000
Software Requirements Specification (SRS) | IEEE Std 830-1984 | Requirements Analyst, Subject Matter Expert | 230 | After SPMP | Subject Matter Expert, Client PM, AWATS PM | AWATS PM, Package PM's, Technical Team Leader, Chief Programmer | $34,500
Interface Requirements Document | DI-MCCR-80026 | Tester | 175 | After SPMP | Software Quality Control | Programmers | $26,250
Interface Design Document | DI-MCCR-80027 | Chief Programmer | 225 | After SRS | Software Quality Control | Programmers | $33,750
Software Design Document | IEEE Std 1016-1998 | Design Team | 350 | After SRS | Chief Programmer | Technical Team Leader, Programmers, Testers | $52,500
Software Test Plan | DI-IPSC-81439 (MIL-STD-498) | Test and Evaluation Team | 275 | Before software development is initiated | Software Quality Control | AWATS PM | $55,000
Version Description Document | NASA-DID-M600 | Configuration Mngmt. Team | 45 | After every software build | Assistant PM | Documentation Team | $6,750
Software Test Report | DI-IPSC-81440 (MIL-STD-498) | Test and Evaluation Team | 125 | After software tests are performed and after build | Assistant PM | AWATS PM, Package PM's | $25,000
System Operators Manual | DI-IPSC-81444 (MIL-STD-498) | Manual Team | 290 | After Detailed Design | Help Desk Team | Help Desk Team, Client | $174,000
Software User's Manual | IEEE Std 1063-1987 | Manual Team | 180 | After Detailed Design | Client | Help Desk Team, Training Team | $108,000
Software Programmer's Manual | DI-IPSC-81447 (MIL-STD-498) | Manual Team, Chief Programmer | 200 | After Detailed Design | Technical Team Leader | Help Desk Team | $40,000
Total Price: $626,250

Table 29 - Software Documentation

4.3 Project Support Functions

In order to fully implement the AWATS project under a CMM Level 2 process, all phases in the software development lifecycle should be monitored and controlled accordingly. This requires great effort not just from the development team but also support from other organizations within the project. To clarify the roles and tasks within the project in meeting the CMM-2 requirements, plans and SOPs should be developed to guide project members, management, and the clients in focusing on achieving the project goals within budget and on schedule. Therefore, the PMP shall address the following support functions that are vital to the project's success: Configuration Management, Quality Assurance, Verification and Validation, Test and Evaluation, and Information Systems Security. Success in achieving these goals will position the project to improve to CMM Level 3.

4.3.1 Configuration Management

The level of effectiveness in using Configuration Management determines how well the AWATS project's changes in hardware, software, firmware, and documentation (including requirements and designs) are kept under control. This includes all software components: ABC-developed AWATS and reusable code, COTS, and outsourced software.

The CM staff will develop a CM plan that will outline all procedures for managing and controlling all changes against the project baseline. All source code and documentation will be under version control and will be tracked during the entire software system lifecycle. This will ensure that the most recent changes will be included in the CM build.

The AWATS software as well as the AWATS product unit will be under systematic, formal control. Therefore, the CM process also includes a Change Control Board (CCB) that decides whether or not proposed changes to the product baseline should be implemented.

AWATS’ Configuration Management also facilitates government and CMM auditing.

4.3.2 Quality Assurance

Quality Assurance responsibilities vary from one corporation to another. Even though QA can be used for validation and verification, QA on the AWATS project shall primarily be responsible for strictly enforcing the procedures outlined in the CM plan as well as the requirements outlined by CMM-2. QA's success will be reflected in any future audits by the USMC and in CMM Level 3 certification efforts.

The QA process on AWATS performs the following tasks:

•  Enforce compliance with CMM-2 and USMC standards, procedures, and guidelines
•  Review and sign off on formal documentation and procedures (i.e., Preliminary Design, Detailed Design)
•  Review and sign off on Software Development Folders (i.e., code, unit tests, CM logs)
•  Review and sign off on Test Plans, Test Results, and Test Logs
•  Review and sign off on AWATS user manuals and related documentation

QA will independently audit the AWATS project and report directly to the Strategic Business Unit's QA organization. A QA signature is required before any formal document or deliverable is approved and delivered to the clients.

4.3.3 Verification and Validation

The verification and validation function is arguably the most critical of all support functions on the AWATS project. The V&V tasks are primarily performed by the AWATS Test Team to ensure that the AWATS system is being built against the right requirements and that the right system is being built. However, all other AWATS stakeholders, including the end-users during Beta Test, should be aware of and implement the SOPs outlined for the AWATS V&V function. A software requirements traceability matrix shall be developed to track the V&V efforts.

All requirements and planning documentation, source code, test cases, and SOPs should also follow the V&V guidelines. The CM and QA staffs will assist with these efforts.

The success of the V&V function determines whether the end-users will be satisfied with the functionality and usefulness of AWATS and, most importantly, whether they have the tools they need to accomplish their daily tasks with ease.

4.3.4 Test and Evaluation

Test and evaluation tasks on the AWATS systems shall be shared by both the project team and the end users (U.S. Marines) before the systems are fully deployed across the Service. During development, software engineers shall develop unit test cases against the code and the Detailed Design Documents. Before the end of the development cycle, an Integration Test plan and a test schedule will be developed to guide the integration test efforts. A System Test plan and schedule will also be created to cover both the Alpha Test and Beta Test efforts.

Configuration Management and Quality Assurance staffs will also be involved at this time. CM will verify that all test plans and test cases are current and saved in the CM Library, and will enforce version control. QA will review test plans and test results to ensure the correct procedures are followed in testing against the AWATS requirements.

4.3.5 Information Systems Security

It is critically important to securely store and accurately control a deadly weapon such as the battlefield-model M-4 assault rifle. Therefore, the secure transmission of RFID signals and the integrity of data encryption between the AWATS system and the scanners or interrogators are a high priority. This also applies to the AWATS development and project environment as well as the lab in which the system is tested.

The U.S. Government's National Information Assurance Glossary defines Information Systems Security as: "Protection of information systems against unauthorized access to or modification of information, whether in storage, processing or transit, and against the denial of service to authorized users or the provision of service to unauthorized users, including those measures necessary to detect, document, and counter such threats."

Because all production units across the USMC will have to meet these policies, it is vital to develop a concrete and sound ISS implementation plan during the AWATS system development lifecycle. Enforcement will be overseen and evaluated by an Information Systems Security Officer (ISSO) on the AWATS project. Cooperating with the Security and Network Infrastructure teams, the ISSO will be responsible for the following activities:

•  Ensuring that all security policies and procedures are followed
•  Developing and maintaining all security documentation, and completing daily/weekly/monthly security checklists
•  Developing policies and procedures to address any security weakness
•  Investigating any suspected or confirmed security infractions and taking corrective action
•  Enforcing policies and procedures on the physical security of the AWATS systems, scanners, interrogators, RFID tags, and USMC weapons inside the lab armory

5.0 Work Packages, Resource Requirements & Estimations, Budget, and Schedule

5.1 Work Packages

To create the Work Breakdown Structure (WBS) for each package, we used a sequence of methods. First, we distributed each software package to the appropriate team leads. Each team lead then further specified the work task items using their expertise, experience, and insight. One specific technique was to use analogies to similar packages we have developed in the past. The project manager then worked with each team lead to ensure that the project plan and the WBS leaf nodes were in sync and made sense for the project as a whole.
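As a side illustration, the roll-up arithmetic implied by the WBS figures below can be sketched in a few lines of C#; the WbsNode class is purely illustrative, and the values are taken from the 1.5.2 Reuse branch of Figure 28:

    using System;
    using System.Collections.Generic;

    // Illustrative WBS node whose total cost is the sum of its children's
    // totals (or its own cost at a leaf), mirroring the roll-ups in Figure 28.
    class WbsNode
    {
        public string Id;
        public string Name;
        public double Cost;
        public List<WbsNode> Children = new List<WbsNode>();

        public WbsNode(string id, string name, double cost)
        {
            Id = id; Name = name; Cost = cost;
        }

        public double TotalCost()
        {
            if (Children.Count == 0) return Cost;
            double total = 0;
            foreach (WbsNode child in Children) total += child.TotalCost();
            return total;
        }
    }

    class Program
    {
        static void Main()
        {
            WbsNode reuse = new WbsNode("1.5.2", "Reuse", 0);
            reuse.Children.Add(new WbsNode("1.5.2.1", "Spreadsheet", 1652448.00));
            reuse.Children.Add(new WbsNode("1.5.2.2", "Requirements Management", 944256.00));
            reuse.Children.Add(new WbsNode("1.5.2.3", "GPS", 1180320.00));

            // Prints 3,777,024.00, matching the 1.5.2 Reuse total in Figure 28.
            Console.WriteLine("{0} {1}: {2:N2}", reuse.Id, reuse.Name, reuse.TotalCost());
        }
    }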

1.0 AWATS – $40,808,696.00
    1.1 Project Management – $347,000.00
    1.2 Technical Management – $512,000.00
    1.3 Quality Assurance – $1,352,000.00
    1.4 Configuration Management – $678,000.00
    1.5 Software Systems – $23,606,400.00
        1.5.1 COTS – $6,373,728.00
            1.5.1.1 Database – $4,013,088.00
            1.5.1.2 Compile/Link/Runtime – $1,416,384.00
            1.5.1.3 Project Management – $944,256.00
        1.5.2 Reuse – $3,777,024.00
            1.5.2.1 Spreadsheet – $1,652,448.00
            1.5.2.2 Requirements Management – $944,256.00
            1.5.2.3 GPS – $1,180,320.00
        1.5.3 Custom Code – $12,983,520.00
            1.5.3.1 Electronic Inventory – $7,554,048.00
            1.5.3.2 Secure Communications – $2,596,704.00
            1.5.3.3 Word Processing – $1,180,320.00
            1.5.3.4 Debugging/Testing – $1,652,448.00
        1.5.4 Outsource – $472,128.00
            1.5.4.1 Graphics – $472,128.00
    1.6 Hardware – $11,025,000.00
    1.7 Kernel – $3,000,000.00
    1.8 Documentation – $626,250.00
    1.9 Software Support Environment – $389,046.00
    1.10 Post Deployment Software Support – $2,162,000.00
    NOTE: No rollout costs – clients install AWATS themselves.

Figure 28 - Work Breakdown Structure: AWATS all 11 packages

1.5.1.1 Database – $4,013,088.00
    1.5.1.1.1 Requirements – $802,617.60
    1.5.1.1.2 Vendor Selection – $601,963.20
    1.5.1.1.3 Configuration – $2,006,544.00
        1.5.1.1.3.1 Interfaces – $541,766.88
            1.5.1.1.3.1.1 Module Interface – $200,654.40
            1.5.1.1.3.1.2 Hardware Interface – $40,130.88
            1.5.1.1.3.1.3 User Interface – $300,981.60
        1.5.1.1.3.2 CRUD Operations – $401,308.80
            1.5.1.1.3.2.1 Create Operations – $100,327.20
            1.5.1.1.3.2.2 Retrieve Operations – $100,327.20
            1.5.1.1.3.2.3 Update Operations – $100,327.20
            1.5.1.1.3.2.4 Delete Operations – $100,327.20
        1.5.1.1.3.3 Security – $260,850.72
            1.5.1.1.3.3.1 Establish Best Practices – $200,654.40
            1.5.1.1.3.3.2 Consultant Examination – $60,196.32
        1.5.1.1.3.4 ER Diagram – $501,636.00
        1.5.1.1.3.5 Hardware – $300,981.60
            1.5.1.1.3.5.1 Memory – $100,327.20
            1.5.1.1.3.5.2 CPU – $100,327.20
            1.5.1.1.3.5.3 Diskspace – $100,327.20
    1.5.1.1.4 Performance/Load Testing – $601,963.20

Figure 29 - Work Breakdown Structure: General Purpose Database Package (COTS)

1.5.1.2 Compile/Link/Runtime – $1,416,384.00
    1.5.1.2.1 Requirements – $100,000.00
    1.5.1.2.2 Vendor Selection – $790,998.00
    1.5.1.2.3 Configuration – $525,386.00
        1.5.1.2.3.1 Interfaces – $105,105.00
            1.5.1.2.3.1.1 Module Interface – $26,276.25
            1.5.1.2.3.1.2 Hardware Interface – $26,276.25
            1.5.1.2.3.1.3 User Interface – $26,276.25
            1.5.1.2.3.1.4 CRMP CM Interface – $26,276.25
        1.5.1.2.3.2 Code & Unit Testing – $270,281.00
            1.5.1.2.3.2.1 Compile Code – $67,570.25
            1.5.1.2.3.2.2 Link Code – $67,570.25
            1.5.1.2.3.2.3 Build Project – $67,570.25
            1.5.1.2.3.2.4 Run Project – $67,570.25
        1.5.1.2.3.3 Hardware – $100,000.00
            1.5.1.2.3.3.1 PC Environment Settings – $10,800.00
            1.5.1.2.3.3.2 PC & Memory – $89,200.00
        1.5.1.2.3.4 Security – $50,000.00

Figure 30 - Work Breakdown Structure: Compile/Link/Runtime Package (COTS)

1.5.1.3 Project Management – $944,256.00
    1.5.1.3.1 Requirements – $100,000.00
    1.5.1.3.2 Vendor Selection – $490,998.00
    1.5.1.3.3 Configuration – $353,258.00
        1.5.1.3.3.1 Interfaces – $105,105.00
            1.5.1.3.3.1.1 Graphic Interface – $35,035.00
            1.5.1.3.3.1.2 User Interface – $35,035.00
            1.5.1.3.3.1.3 Word Processing Interface – $35,035.00
        1.5.1.3.3.2 Analysis & Track – $218,153.00
            1.5.1.3.3.2.1 Schedule Analysis – $72,717.66
            1.5.1.3.3.2.2 Tracking Progress – $72,717.66
            1.5.1.3.3.2.3 Deadline Management – $72,717.66
        1.5.1.3.3.3 Hardware – $30,000.00
            1.5.1.3.3.3.1 PC Environment Settings – $5,800.00
            1.5.1.3.3.3.2 PC & Memory – $24,200.00

Figure 31 - Work Breakdown Structure: Project Management Package (COTS)

[WBS chart: 1.5.2.1 Spreadsheet ($1,652,448.00) decomposed into Requirements, Component Selection, Customization (Memory Management, Database Translation, Predictive Text, RFID Recognition), Unit Testing, and Documentation; per-node costs appear in Table 34.]

Figure 32 - Work Breakdown Structure: Spreadsheet Package (Reuse)

[WBS chart: 1.5.2.2 Requirements Management ($944,256.00) decomposed into Requirements, Component Selection, Customization (RFID Standards Module, EM Standards Module, Electronic Product Code (EPC) Module, International Frequency Tables), Unit Testing, and Documentation; per-node costs appear in Table 34.]

Figure 33 - Work Breakdown Structure: Requirement Management Package (Reuse)

[WBS chart: 1.5.2.3 GPS ($1,180,320.00) decomposed into Requirements, Component Selection, Customization (GPS Database, Mapping Software, User Interface), Unit Testing, and Documentation; per-node costs appear in Table 34.]

Figure 34 - Work Breakdown Structure: GPS Navigation Package (Reuse)

[WBS chart: 1.5.3.1 Electronic Inventory ($7,554,048.00) decomposed into Requirements (Interview Users, Write Requirements Document), Design (RFID Interface, Security Communications Layer Interface, Armory Interface, Database Interface), Construction, Unit Testing, and Documentation; per-node costs appear in Table 34.]

Figure 35 - Work Breakdown Structure: Electronic Inventory & Tracking Package (Custom)

[WBS chart: 1.5.3.2 Secure Communications ($2,596,704.00) decomposed into Requirements Specifications, Preliminary Design, Detailed Design (Hardware Interfaces; Server/Client Link with SSL for Server and SSL for Client), Code and Unit Testing, Integration Testing, and System Test; per-node costs appear in Table 34.]

Figure 36 - Work Breakdown Structure: Communications Package (Custom)

[WBS chart: 1.5.3.3 Word Processing ($1,180,320.00) decomposed into Requirements Specifications, Preliminary Design, Detailed Design (Reporting, Software), Code and Unit Testing, Integration Testing, and System Test; per-node costs appear in Table 34.]

Figure 37 - Work Breakdown Structure: Word Processing Package (Custom)

[WBS chart: 1.5.3.4 Debugging/Testing ($1,652,448.00) decomposed into Requirements Specifications, Preliminary Design, Detailed Design (Testing Interface, Hardware Interface, Software Test Suite), Code and Unit Testing, Integration Testing, and System Test; per-node costs appear in Table 34.]

Figure 38 - Work Breakdown Structure: Debugging/Testing Packages (Custom)

[WBS chart: 1.5.4.1 Graphics ($472,128.00) decomposed into Requirements (Interview Users, Write Requirements Document), Preliminary Design (Divide into Components, Create Prototype), Detailed Design (Write Use Cases, Create UML Diagrams), and Code and Unit Test; per-node costs appear in Table 34.]

Figure 39 – Work Breakdown Structure: Graphical Presentation Packages (Outsourced)

Table 30 – Work Breakdown Structure Task List
ID#  WBS  Task Name
1 1 AWATS
2 1.1 Project Management
3 1.2 Technical Management
4 1.3 Quality Assurance
5 1.4 Configuration Management
6 1.5 Software Systems
7 1.5.1 COTS
8 1.5.1.1 Database
9 1.5.1.1.1 Requirements
10 1.5.1.1.2 Vendor Selection
11 1.5.1.1.3 Configuration
12 1.5.1.1.3.1 Interfaces
13 1.5.1.1.3.1.1 Module Interface
14 1.5.1.1.3.1.2 Hardware Interface
15 1.5.1.1.3.1.3 User Interface
16 1.5.1.1.3.2 CRUD Operations
17 1.5.1.1.3.2.1 Create Operations
18 1.5.1.1.3.2.2 Retrieve Operations
19 1.5.1.1.3.2.3 Update Operations
20 1.5.1.1.3.2.4 Delete Operations
21 1.5.1.1.3.3 Security
22 1.5.1.1.3.3.1 Establish Best Practices
23 1.5.1.1.3.3.2 Consultant Examination
24 1.5.1.1.3.4 ER Diagram
25 1.5.1.1.3.5 Hardware
26 1.5.1.1.3.5.1 Memory
27 1.5.1.1.3.5.2 CPU
28 1.5.1.1.3.5.3 Diskspace
29 1.5.1.1.4 Performance/Load Testing
30 1.5.1.2 Compile/Link/Runtime
31 1.5.1.2.1 Requirements
32 1.5.1.2.2 Vendor Selection
33 1.5.1.2.3 Configuration
34 1.5.1.2.3.1 Interfaces
35 1.5.1.2.3.1.1 Module Interface

36 1.5.1.2.3.1.2 Hardware Interface
37 1.5.1.2.3.1.3 User Interface
38 1.5.1.2.3.1.4 CRMP CM Interface
39 1.5.1.2.3.2 Code & Unit Testing
40 1.5.1.2.3.2.1 Compile Code
41 1.5.1.2.3.2.2 Link Code
42 1.5.1.2.3.2.3 Build Project
43 1.5.1.2.3.2.4 Run Project
44 1.5.1.2.3.3 Hardware
45 1.5.1.2.3.3.1 PC Environment Settings
46 1.5.1.2.3.3.2 PC & Memory
47 1.5.1.2.3.4 Security
48 1.5.1.3 Project Management
49 1.5.1.3.1 Requirements
50 1.5.1.3.2 Vendor Selection
51 1.5.1.3.3 Configuration
52 1.5.1.3.3.1 Interfaces
53 1.5.1.3.3.1.1 Graphic Interface
54 1.5.1.3.3.1.2 User Interface
55 1.5.1.3.3.1.3 Word Processing Interface
56 1.5.1.3.3.2 Analysis & Track
57 1.5.1.3.3.2.1 Schedule Analysis
58 1.5.1.3.3.2.2 Tracking Progress
59 1.5.1.3.3.2.3 Deadline Management
60 1.5.1.3.3.3 Hardware
61 1.5.1.3.3.3.1 PC Environment Settings
62 1.5.1.3.3.3.2 PC & Memory
63 1.5.2 REUSE
64 1.5.2.1 Spreadsheet
65 1.5.2.1.1 Requirements
66 1.5.2.1.2 Component Selection
67 1.5.2.1.3 Customization
68 1.5.2.1.3.1 Memory Management
69 1.5.2.1.3.1.1 Memory Integrity
70 1.5.2.1.3.1.2 Scan Algorithm
71 1.5.2.1.3.2 Database Translation
72 1.5.2.1.3.2.1 Data Layer

73 1.5.2.1.3.2.2 Business Layer
74 1.5.2.1.3.2.3 Presentation Layer
75 1.5.2.1.3.2.4 Interface Layer
76 1.5.2.1.3.3 Predictive Text
77 1.5.2.1.3.3.1 Text Database
78 1.5.2.1.3.3.2 Retrieval Algorithm
79 1.5.2.1.3.4 RFID Recognition
80 1.5.2.1.3.4.1 RFID Pattern Database
81 1.5.2.1.3.4.2 Pattern Checker
82 1.5.2.1.4 Unit Testing
83 1.5.2.1.5 Documentation
84 1.5.2.2 Requirements Management
85 1.5.2.2.1 Requirements
86 1.5.2.2.2 Component Selection
87 1.5.2.2.3 Customization
88 1.5.2.2.3.1 RFID Standards Module
89 1.5.2.2.3.1.1 Generation 1 Data
90 1.5.2.2.3.1.2 Generation 2 Data
91 1.5.2.2.3.2 EM Standards Module
92 1.5.2.2.3.2.1 U.S. EM Standards Adherence Data
93 1.5.2.2.3.2.2 ISO EM Standards Adherence Data
94 1.5.2.2.3.3 Electronic Product Code (EPC) Module
95 1.5.2.2.3.3.1 EPC Web Service Interface
96 1.5.2.2.3.4 International Frequency Table
97 1.5.2.2.3.4.1 ISO EM Spectrum Database Interface
98 1.5.2.2.3.4.2 Interference Probability Table
99 1.5.2.2.3.4.3 Interference Probability Algorithm
100 1.5.2.2.4 Unit Testing
101 1.5.2.2.5 Documentation
102 1.5.2.3 GPS
103 1.5.2.3.1 Requirements
104 1.5.2.3.2 Component Selection
105 1.5.2.3.3 Customization
106 1.5.2.3.3.1 GPS Database
107 1.5.2.3.3.1.1 Data Layer
108 1.5.2.3.3.1.2 Business Layer
109 1.5.2.3.3.1.3 Presentation Layer

110 1.5.2.3.3.2 Mapping Software
111 1.5.2.3.3.2.1 Image Coordinate Database
112 1.5.2.3.3.2.2 Coordinate Algorithm
113 1.5.2.3.3.3 User Interface
114 1.5.2.3.4 Unit Testing
115 1.5.2.3.5 Documentation
116 1.5.3 CUSTOM CODE
117 1.5.3.1 Electronic Inventory
118 1.5.3.1.1 Requirements
119 1.5.3.1.1.1 Interview Users
120 1.5.3.1.1.2 Write Requirements Document
121 1.5.3.1.2 Design
122 1.5.3.1.2.1 RFID Interface
123 1.5.3.1.2.2 Security Communications Layer Interface
124 1.5.3.1.2.2.1 Send Module
125 1.5.3.1.2.2.2 Receive Module
126 1.5.3.1.2.3 Armory Interface
127 1.5.3.1.2.3.1 Assign Weapon Module
128 1.5.3.1.2.3.2 Authentication Module
129 1.5.3.1.2.3.3 Authorization Module
130 1.5.3.1.2.3.4 Receive Weapon Module
131 1.5.3.1.2.4 Database Interface
132 1.5.3.1.3 Construction
133 1.5.3.1.4 Unit Testing
134 1.5.3.1.5 Documentation
135 1.5.3.2 Secure Communications
136 1.5.3.2.1 Requirements Specification
137 1.5.3.2.2 Preliminary Design
138 1.5.3.2.3 Detailed Design
139 1.5.3.2.3.1 Hardware Interfaces
140 1.5.3.2.3.1.1 Client Interface
141 1.5.3.2.3.1.2 Server Interface
142 1.5.3.2.3.2 Server/Client Link
143 1.5.3.2.3.2.1 Security
144 1.5.3.2.3.2.1.1 SSL for Server
145 1.5.3.2.3.2.1.2 SSL for Client
146 1.5.3.2.4 Code and Unit Testing

147 1.5.3.2.5 Integration Testing
148 1.5.3.2.6 System Test
149 1.5.3.3 Word Processing
150 1.5.3.3.1 Requirements Specification
151 1.5.3.3.2 Preliminary Design
152 1.5.3.3.3 Detailed Design
153 1.5.3.3.3.1 Reporting
154 1.5.3.3.3.1.1 User Interface
155 1.5.3.3.3.1.2 Interface to Graphics Package
156 1.5.3.3.3.2 Software
157 1.5.3.3.3.2.1 User Interface
158 1.5.3.3.3.2.2 Server Interface
159 1.5.3.3.3.2.3 Module Interface
160 1.5.3.3.4 Code and Unit Testing
161 1.5.3.3.5 Integration Testing
162 1.5.3.3.6 System Test
163 1.5.3.4 Debugging/Testing
164 1.5.3.4.1 Requirements Specification
165 1.5.3.4.2 Preliminary Design
166 1.5.3.4.3 Detailed Design
167 1.5.3.4.3.1 Testing Interface
168 1.5.3.4.3.1.1 User Interface
169 1.5.3.4.3.1.2 Module Interface
170 1.5.3.4.3.2 Hardware Interface
171 1.5.3.4.3.2.1 User Interface
172 1.5.3.4.3.2.2 Module Interface
173 1.5.3.4.3.2.3 Server Interface
174 1.5.3.4.3.2.4 RFID Interface
175 1.5.3.4.3.2.4.1 Virtual Barcode Scanner
176 1.5.3.4.3.2.4.2 RFID Virtual Interrogator
177 1.5.3.4.3.2.4.3 RFID Virtual Tags
178 1.5.3.4.3.3 Software Test Suite
179 1.5.3.4.3.3.1 Requisition Test Logic
180 1.5.3.4.3.3.2 Integration and System Test
181 1.5.3.4.3.3.3 Security Test
182 1.5.3.4.3.3.4 Failover Test
183 1.5.3.4.3.3.5 Performance Test Logic

184 1.5.3.4.4 Code and Unit Testing
185 1.5.3.4.5 Integration Testing
186 1.5.3.4.6 System Test
187 1.5.4 OUTSOURCE
188 1.5.4.1 Graphics
189 1.5.4.1.1 Requirements
190 1.5.4.1.1.1 Interview Users
191 1.5.4.1.1.2 Write Requirements Document
192 1.5.4.1.2 Preliminary Design
193 1.5.4.1.2.1 Divide into Components
194 1.5.4.1.2.2 Create Prototype
195 1.5.4.1.3 Detailed Design
196 1.5.4.1.3.1 Write Use Cases
197 1.5.4.1.3.2 Create UML Diagrams
198 1.5.4.1.4 Code and Unit Test
199 1.6 Hardware
200 1.7 Kernel
201 1.8 Documentation
202 1.9 Software Support Environment
203 1.10 Post Deployment Software Support
Table 30 – Work Breakdown Structure Task List

5.1.1 Work Package Specification Example*

Work Package Specification: DATABASE-DD-001

Activity Number: 1.5.1.1.3.1.3


Activity Name: DATABASE-DD-001
Feature Description: Configure a graphical interface to create, retrieve, update, and delete RFID tag, weapon, and Marine data in the
AWATS database.
Activity Description: Configure the user interfaces on both the user and the database sides, allowing accurate and user-friendly
database handling.
Estimated Duration: 2 Weeks
Resources Needed:
Personnel: 1 Requirements Analyst, 1 Systems Engineer/Analyst, 1 Database Administrator, 1 Programmer, 1 RFID Engineer
Skills: User interface configuration knowledge, requirements knowledge, database setting-up and tuning knowledge
Tools: Microsoft SQL Server 2005, User Interface Builder
Travel: None
Work Product: Implementation of a database user interface configuration to link up the COTS database to the user interfaces.
Risks: Incorrect configurations will cause slow data transactions or even loss of data.
Predecessors: Hardware interface is completed.
Completion Criteria: Approval of a set of configurations that will be used on the AWATS system to interface between the database
and the users.

IMPLEMENTATION

Personnel Assigned:
Starting Date:
Completion Date:
Costs (Budgeted/Total): $300,981.60/$902,944.80
Legacy:
Comments:

* Please see Appendix J for the complete list of Work Package Specifications
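Each work package can also be captured in a structured form for tracking. The C# type below is a hypothetical model of the template above; the field names mirror the printed form and are not part of any AWATS deliverable:

    // Hypothetical model of the work-package template; fields mirror the form above.
    public class WorkPackageSpec
    {
        public string Id;                 // e.g. "DATABASE-DD-001"
        public string ActivityNumber;     // WBS leaf, e.g. "1.5.1.1.3.1.3"
        public string FeatureDescription;
        public string ActivityDescription;
        public int EstimatedDurationWeeks;
        public string[] Personnel;
        public string[] Skills;
        public string[] Tools;
        public string Travel;
        public string WorkProduct;
        public string Risks;
        public string Predecessors;
        public string CompletionCriteria;
        public decimal BudgetedCost;      // e.g. 300981.60m
        public decimal TotalCost;         // e.g. 902944.80m
    }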

5.2 Dependencies

Figure 40 – Dependencies (Part I)

Figure 41 – Dependencies (Part II)

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

PROJECT PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution


==============================================================================
PROJECT AWATS
SLOC 1116000
TOTAL EFFORT 8000.606 Person Months
==============================================================================
PCNT EFFORT (PM) PCNT SCHEDULE Staff
Plans And Requirements 7.000 560.042 22.406 14.959 37.439
Product Design 17.000 1360.103 27.203 18.162 74.889
Programming 54.392 4351.660 43.189 28.835 150.918
- Detailed Design 23.797 1903.921 ---- ---- ----
- Code and Unit Test 30.594 2447.739 ---- ---- ----
Integration and Test 28.608 2288.843 29.608 19.768 115.787
==============================================================================
Life Cycle Phase Plans And Requirements
Life Cycle Effort 560.042 Person Months
Life Cycle Schedule 14.959 Months
==============================================================================
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 250.883 14.959 16.772
Product Design 17.601 98.575 14.959 6.590
Programming 5.703 31.938 14.959 2.135
Test Planning 4.101 22.970 14.959 1.536
Verification and Validation 7.601 42.571 14.959 2.846
Project Office 12.297 68.870 14.959 4.604
CM/QA 2.899 16.233 14.959 1.085
Manuals 5.000 28.002 14.959 1.872

* Please see Appendix K for the complete list of COCOMO II outputs.
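As a quick cross-check on the tool output, each Staff figure is simply the phase effort divided by the phase schedule; a minimal sketch in C#:

    // Cross-check of the COCOMO II phase figures above:
    // average staff = phase effort (person-months) / phase schedule (months).
    double effort = 560.042;          // Plans And Requirements effort
    double schedule = 14.959;         // Plans And Requirements schedule
    double staff = effort / schedule; // about 37.4, matching the 37.439 reported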
5.3 Resource Requirements

Application | Initial Size Estimates (KEDSI): M1, M2, M3 | Development Class | Computed Final Size (KEDSI): M1, M2, M3 | Computed EV Size | Computed Effort (Staff Months): COCOMO, Walston-Felix, Boyston
General Purpose Database Package | 80, 120, 180 | 1 | 16, 24, 36 | 24.7 | 168.6, 96.1, 82.5
Spreadsheet Package | 50, 65, 73 | 2 | 10, 13, 14.6 | 12.8 | 76.5, 52.8, 45.5
Configuration & Requirements Management Package | 62, 78, 89 | 2 | 12.4, 15.6, 17.8 | 15.4 | 96.0, 62.7, 53.8
Communication Package | 43, 62, 70 | 3 | 8.6, 12.4, 14 | 12.0 | 71.2, 50.0, 43.2
Graphics Presentation Package | 80, 110, 124 | 3 | 16, 22, 24.8 | 21.5 | 142.7, 84.7, 72.5
Word Processing Package | 50, 65, 80 | 3 | 10, 13, 16 | 13.0 | 78.2, 53.7, 46.2
Project Manager’s Package | 45, 56, 70 | 1 | 9, 11.2, 14 | 11.3 | 66.1, 47.2, 40.9
GPS Navigation Package | 69, 74, 82 | 2 | 13.8, 14.8, 16.4 | 14.9 | 92.1, 60.8, 52.1
Compile/Link/Runtime Packages for MS Visual Studio | 43, 56, 69 | 1 | 8.6, 11.2, 13.8 | 11.2 | 65.4, 46.9, 40.6
Language Independent Debugging & Testing Package | 52, 65, 80 | 3 | 10.4, 13, 16 | 13.1 | 78.7, 53.9, 46.4
Electronic Inventory & Tracking Package | 345, 365, 380 | 3 | 69, 73, 76 | 72.8 | 618.2, 257.5, 232.3
Table 31 – Resource Requirements (Part I)
* See Notes Below

Application | Computed Expected Effort (Staff Months) | Computed Duration (Months): COCOMO, Walston-Felix, Boyston | Expected Duration (Months) | Schedule Compression Value (%) | Compressed Duration (Months) | Effort Adjustment Factor, EAF | Adjusted Effort, AE (Staff Months)
General Purpose Database Package | 105.9 | 11.1, 22.0, 10.0 | 12.7 | 0 | 12.7 | 1 | 105.9
Spreadsheet Package | 55.5 | 9.0, 17.4, 8.1 | 10.3 | 0 | 10.3 | 1 | 55.5
Configuration & Requirements Management Package | 66.8 | 9.6, 18.6, 8.6 | 10.9 | 0 | 10.9 | 1 | 66.8
Communication Package | 52.4 | 8.9, 17.1, 7.9 | 10.1 | 0 | 10.1 | 1 | 52.4
Graphics Presentation Package | 92.3 | 10.6, 20.9, 9.6 | 12.2 | 0 | 12.2 | 1 | 92.3
Word Processing Package | 56.5 | 9.1, 17.5, 8.1 | 10.3 | 0 | 10.3 | 1 | 56.5
Project Manager’s Package | 49.3 | 8.7, 16.7, 7.8 | 9.9 | 0 | 9.9 | 1 | 49.3
GPS Navigation Package | 64.5 | 9.5, 18.4, 8.5 | 10.8 | 0 | 10.8 | 1 | 64.5
Compile/Link/Runtime Packages for MS Visual Studio | 48.9 | 8.7, 16.6, 7.8 | 9.9 | 0 | 9.9 | 1 | 48.9
Language Independent Debugging & Testing Package | 56.8 | 9.1, 17.6, 8.2 | 10.4 | 0 | 10.4 | 1 | 56.8
Electronic Inventory & Tracking Package | 313.4 | 15.7, 32.5, 14.3 | 18.3 | 2 | 17.9 | 1.02 | 319.7
Table 32 – Resource Requirements (Part II)

Application | Productivity (EDSI/Staff Month) | Average Staff
General Purpose Database Package | 232.9 | 8.3
Spreadsheet Package | 229.9 | 5.4
Configuration & Requirements Management Package | 231.1 | 6.1
Communication Package | 229.5 | 5.2
Graphics Presentation Package | 232.5 | 7.6
Word Processing Package | 230.1 | 5.5
Project Manager’s Package | 229.1 | 5.0
GPS Navigation Package | 230.9 | 6.0
Compile/Link/Runtime Packages for MS Visual Studio | 229.0 | 5.0
Language Independent Debugging & Testing Package | 230.1 | 5.5
Electronic Inventory & Tracking Package | 227.9 | 17.8
Table 33 – Resource Requirements (Part III)

*NOTES:

KEDSI is thousands of estimated delivered source instructions

DEVELOPMENT CLASS: 1 = COTS 2 = Reuse 3 = Custom

Method 1 is analogy using ranges from Table 1: Software Productivity by Application Domains at
http://www.stsc.hill.af.mil/crosstalk/2002/03/reifer.html
Method 2 is division of application storage allocation by 4 to obtain estimate
Method 3 is modified Delphi using expert team members
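The arithmetic behind the Computed EV Size and effort columns can be sketched as follows. The expected value is the standard three-point (PERT) average of the three methods' estimates; the effort equations shown (basic COCOMO in embedded mode and Walston-Felix) are our assumption about the exact model forms, chosen because they reproduce the first row of Table 31:

    using System;

    class EstimateCheck
    {
        static void Main()
        {
            // Three-point (PERT) expected size in KEDSI: (low + 4*likely + high) / 6.
            // Values are the Database package's computed final sizes from Table 31.
            double ev = (16 + 4 * 24 + 36) / 6.0;           // about 24.7

            // Effort in staff months; coefficients assumed (basic COCOMO,
            // embedded mode, and Walston-Felix):
            double cocomo = 3.6 * Math.Pow(ev, 1.20);       // about 168.6
            double walstonFelix = 5.2 * Math.Pow(ev, 0.91); // about 96.1

            Console.WriteLine("EV={0:F1}  COCOMO={1:F1}  Walston-Felix={2:F1}",
                              ev, cocomo, walstonFelix);
        }
    }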

5.4 Budget and Schedule Allocation
Note that throughout Table 34 the total cost of each element is three times its fixed cost.
Table 34 – Budget & Schedule Allocation
ID#  WBS  Task Name  Fixed Cost  Total Costs
1 1 AWATS $40,808,696.00 $122,426,088.00
2 1.1 Project Management $347,000.00 $1,041,000.00
3 1.2 Technical Management $512,000.00 $1,536,000.00
4 1.3 Quality Assurance $1,352,000.00 $4,056,000.00
5 1.4 Configuration Management $678,000.00 $2,034,000.00
6 1.5 Software Systems $23,606,400.00 $70,819,200.00
7 1.5.1 COTS $6,373,728.00 $19,121,184.00
8 1.5.1.1 Database $4,013,088.00 $12,039,264.00
9 1.5.1.1.1 Requirements $802,617.60 $2,407,852.80
10 1.5.1.1.2 Vendor Selection $601,963.20 $1,805,889.60
11 1.5.1.1.3 Configuration $2,006,544.00 $6,019,632.00
12 1.5.1.1.3.1 Interfaces $541,766.88 $1,625,300.64
13 1.5.1.1.3.1.1 Module Interface $200,654.40 $601,963.20
14 1.5.1.1.3.1.2 Hardware Interface $40,130.88 $120,392.64
15 1.5.1.1.3.1.3 User Interface $300,981.60 $902,944.80
16 1.5.1.1.3.2 CRUD Operations $401,308.80 $1,203,926.40
17 1.5.1.1.3.2.1 Create Operations $100,327.20 $300,981.60
18 1.5.1.1.3.2.2 Retrieve Operations $100,327.20 $300,981.60
19 1.5.1.1.3.2.3 Update Operations $100,327.20 $300,981.60
20 1.5.1.1.3.2.4 Delete Operations $100,327.20 $300,981.60
21 1.5.1.1.3.3 Security $260,850.72 $782,552.16
22 1.5.1.1.3.3.1 Establish Best Practices $200,654.40 $601,963.20
23 1.5.1.1.3.3.2 Consultant Examination $60,196.32 $180,588.96
24 1.5.1.1.3.4 ER Diagram $501,636.00 $1,504,908.00
25 1.5.1.1.3.5 Hardware $300,981.60 $902,944.80
26 1.5.1.1.3.5.1 Memory $100,327.20 $300,981.60
27 1.5.1.1.3.5.2 CPU $100,327.20 $300,981.60
28 1.5.1.1.3.5.3 Diskspace $100,327.20 $300,981.60
29 1.5.1.1.4 Performance/Load Testing $601,963.20 $1,805,889.60
30 1.5.1.2 Compile/Link/Runtime $1,416,384.00 $4,249,152.00
31 1.5.1.2.1 Requirements $100,000.00 $300,000.00
32 1.5.1.2.2 Vendor Selection $790,998.00 $2,372,994.00
33 1.5.1.2.3 Configuration $525,386.00 $1,576,158.00

34 1.5.1.2.3.1 Interfaces $105,105.00 $315,315.00
35 1.5.1.2.3.1.1 Module Interface $26,276.25 $78,828.75
36 1.5.1.2.3.1.2 Hardware Interface $26,276.25 $78,828.75
37 1.5.1.2.3.1.3 User Interface $26,276.25 $78,828.75
38 1.5.1.2.3.1.4 CRMP CM Interface $26,276.25 $78,828.75
39 1.5.1.2.3.2 Code & Unit Testing $270,281.00 $810,843.00
40 1.5.1.2.3.2.1 Compile Code $67,570.25 $202,710.75
41 1.5.1.2.3.2.2 Link Code $67,570.25 $202,710.75
42 1.5.1.2.3.2.3 Build Project $67,570.25 $202,710.75
43 1.5.1.2.3.2.4 Run Project $67,570.25 $202,710.75
44 1.5.1.2.3.3 Hardware $100,000.00 $300,000.00
45 1.5.1.2.3.3.1 PC Environment Settings $10,800.00 $32,400.00
46 1.5.1.2.3.3.2 PC & Memory $89,200.00 $267,600.00
47 1.5.1.2.3.4 Security $50,000.00 $150,000.00
48 1.5.1.3 Project Management $944,256.00 $2,832,768.00
49 1.5.1.3.1 Requirements $100,000.00 $300,000.00
50 1.5.1.3.2 Vendor Selection $490,998.00 $1,472,994.00
51 1.5.1.3.3 Configuration $353,258.00 $1,059,774.00
52 1.5.1.3.3.1 Interfaces $105,105.00 $315,315.00
53 1.5.1.3.3.1.1 Graphic Interface $35,035.00 $105,105.00
54 1.5.1.3.3.1.2 User Interface $35,035.00 $105,105.00
55 1.5.1.3.3.1.3 Word Processing Interface $35,035.00 $105,105.00
56 1.5.1.3.3.2 Analysis & Track $218,153.00 $654,459.00
57 1.5.1.3.3.2.1 Schedule Analysis $72,717.66 $218,152.98
58 1.5.1.3.3.2.2 Tracking Progress $72,717.66 $218,152.98
59 1.5.1.3.3.2.3 Deadline Management $72,717.66 $218,152.98
60 1.5.1.3.3.3 Hardware $30,000.00 $90,000.00
61 1.5.1.3.3.3.1 PC Environment Settings $5,800.00 $17,400.00
62 1.5.1.3.3.3.2 PC & Memory $24,200.00 $72,600.00
63 1.5.2 REUSE $3,777,024.00 $11,331,072.00
64 1.5.2.1 Spreadsheet $1,652,448.00 $4,957,344.00
65 1.5.2.1.1 Requirements $462,685.44 $1,388,056.32
66 1.5.2.1.2 Component Selection $66,097.92 $198,293.76
67 1.5.2.1.3 Customization $660,979.20 $1,982,937.60
68 1.5.2.1.3.1 Memory Management $118,976.26 $356,928.78
69 1.5.2.1.3.1.1 Memory Integrity $23,795.25 $71,385.75
70 1.5.2.1.3.1.2 Scan Algorithm $95,181.01 $285,543.03

71 1.5.2.1.3.2 Database Translation $370,148.35 $1,110,445.05
72 1.5.2.1.3.2.1 Data Layer $55,522.25 $166,566.75
73 1.5.2.1.3.2.2 Business Layer $92,537.09 $277,611.27
74 1.5.2.1.3.2.3 Presentation Layer $148,059.34 $444,178.02
75 1.5.2.1.3.2.4 Interface Layer $74,029.67 $222,089.01
76 1.5.2.1.3.3 Predictive Text $105,756.67 $317,270.01
77 1.5.2.1.3.3.1 Text Database $42,302.67 $126,908.01
78 1.5.2.1.3.3.2 Retrieval Algorithm $63,454.00 $190,362.00
79 1.5.2.1.3.4 RFID Recognition $66,097.92 $198,293.76
80 1.5.2.1.3.4.1 RFID Pattern Database $29,744.06 $89,232.18
81 1.5.2.1.3.4.2 Pattern Checker $36,353.86 $109,061.58
82 1.5.2.1.4 Unit Testing $264,391.68 $793,175.04
83 1.5.2.1.5 Documentation $198,293.76 $594,881.28
84 1.5.2.2 Requirements Management $944,256.00 $2,832,768.00
85 1.5.2.2.1 Requirements $188,851.20 $566,553.60
86 1.5.2.2.2 Component Selection $28,327.68 $84,983.04
87 1.5.2.2.3 Customization $491,013.12 $1,473,039.36
88 1.5.2.2.3.1 RFID Standards Module $29,460.79 $88,382.37
89 1.5.2.2.3.1.1 Generation 1 Data $15,319.61 $45,958.83
90 1.5.2.2.3.1.2 Generation 2 Data $14,141.18 $42,423.54
91 1.5.2.2.3.2 EM Standards Module $39,281.05 $117,843.15
92 1.5.2.2.3.2.1 U.S. EM Standards Adherence Data $9,820.26 $29,460.78
93 1.5.2.2.3.2.2 ISO EM Standards Adherence Data $29,460.79 $88,382.37
94 1.5.2.2.3.3 Electronic Product Code (EPC) Module $368,259.84 $1,104,779.52
95 1.5.2.2.3.3.1 EPC Web Service Interface $368,259.84 $1,104,779.52
96 1.5.2.2.3.4 International Frequency Table $54,011.44 $162,034.32
97 1.5.2.2.3.4.1 ISO EM Spectrum Database Interface $10,802.29 $32,406.87
98 1.5.2.2.3.4.2 Interference Probability Table $16,203.43 $48,610.29
99 1.5.2.2.3.4.3 Interference Probability Algorithm $27,005.72 $81,017.16
100 1.5.2.2.4 Unit Testing $122,753.28 $368,259.84
101 1.5.2.2.5 Documentation $113,310.72 $339,932.16
102 1.5.2.3 GPS $1,180,320.00 $3,540,960.00
103 1.5.2.3.1 Requirements $365,899.20 $1,097,697.60
104 1.5.2.3.2 Component Selection $472,128.00 $1,416,384.00
105 1.5.2.3.3 Customization $531,144.00 $1,593,432.00
106 1.5.2.3.3.1 GPS Database $387,735.12 $1,163,205.36
107 1.5.2.3.3.1.1 Data Layer $58,160.27 $174,480.81

108 1.5.2.3.3.1.2 Business Layer $174,480.80 $523,442.40
109 1.5.2.3.3.1.3 Presentation Layer $155,094.05 $465,282.15
110 1.5.2.3.3.2 Mapping Software $95,605.92 $286,817.76
111 1.5.2.3.3.2.1 Image Coordinate Database $76,484.74 $229,454.22
112 1.5.2.3.3.2.2 Coordinate Algorithm $19,121.18 $57,363.54
113 1.5.2.3.3.3 User Interface $47,802.96 $143,408.88
114 1.5.2.3.4 Unit Testing $141,638.40 $424,915.20
115 1.5.2.3.5 Documentation $94,425.60 $283,276.80
116 1.5.3 CUSTOM CODE $12,983,520.00 $38,950,560.00
117 1.5.3.1 Electronic Inventory $7,554,048.00 $22,662,144.00
118 1.5.3.1.1 Requirements $1,133,107.20 $3,399,321.60
119 1.5.3.1.1.1 Interview Users $396,587.52 $1,189,762.56
120 1.5.3.1.1.2 Write Requirements Document $736,519.68 $2,209,559.04
121 1.5.3.1.2 Design $3,021,619.20 $9,064,857.60
122 1.5.3.1.2.1 RFID Interface $362,594.30 $1,087,782.90
123 1.5.3.1.2.2 Security Communications Layer Interface $543,891.46 $1,631,674.38
124 1.5.3.1.2.2.1 Send Module $271,945.73 $815,837.19
125 1.5.3.1.2.2.2 Receive Module $271,945.73 $815,837.19
126 1.5.3.1.2.3 Armory Interface $1,510,809.60 $4,532,428.80
127 1.5.3.1.2.3.1 Assign Weapon Module $483,459.70 $1,450,379.10
128 1.5.3.1.2.3.2 Authentication Module $271,945.73 $815,837.19
129 1.5.3.1.2.3.3 Authorization Module $211,513.34 $634,540.02
130 1.5.3.1.2.3.4 Receive Weapon Module $543,891.46 $1,631,674.38
131 1.5.3.1.2.4 Database Interface $604,323.84 $1,812,971.52
132 1.5.3.1.3 Construction $1,888,512.00 $5,665,536.00
133 1.5.3.1.4 Unit Testing $1,133,107.20 $3,399,321.60
134 1.5.3.1.5 Documentation $377,702.40 $1,133,107.20
135 1.5.3.2 Secure Communications $2,596,704.00 $7,790,112.00
136 1.5.3.2.1 Requirements Specification $140,222.01 $420,666.03
137 1.5.3.2.2 Preliminary Design $311,604.48 $934,813.44
138 1.5.3.2.3 Detailed Design $1,817,692.80 $5,453,078.40
139 1.5.3.2.3.1 Hardware Interfaces $727,077.12 $2,181,231.36
140 1.5.3.2.3.1.1 Client Interface $363,538.56 $1,090,615.68
141 1.5.3.2.3.1.2 Server Interface $363,538.56 $1,090,615.68
142 1.5.3.2.3.2 Server/Client Link $1,090,615.68 $3,271,847.04
143 1.5.3.2.3.2.1 Security $1,090,615.68 $3,271,847.04
144 1.5.3.2.3.2.1.1 SSL for Server $545,307.84 $1,635,923.52

145 1.5.3.2.3.2.1.2 SSL for Client $545,307.84 $1,635,923.52
146 1.5.3.2.4 Code and Unit Testing $109,061.57 $327,184.71
147 1.5.3.2.5 Integration Testing $109,061.57 $327,184.71
148 1.5.3.2.6 System Test $109,061.57 $327,184.71
149 1.5.3.3 Word Processing $1,180,320.00 $3,540,960.00
150 1.5.3.3.1 Requirements Specification $63,737.28 $191,211.84
151 1.5.3.3.2 Preliminary Design $141,638.40 $424,915.20
152 1.5.3.3.3 Detailed Design $826,224.00 $2,478,672.00
153 1.5.3.3.3.1 Reporting $420,567.00 $1,261,701.00
154 1.5.3.3.3.1.1 User Interface $120,488.00 $361,464.00
155 1.5.3.3.3.1.2 Interface to Graphics Package $300,079.00 $900,237.00
156 1.5.3.3.3.2 Software $405,657.00 $1,216,971.00
157 1.5.3.3.3.2.1 User Interface $135,219.00 $405,657.00
158 1.5.3.3.3.2.2 Server Interface $135,219.00 $405,657.00
159 1.5.3.3.3.2.3 Module Interface $135,219.00 $405,657.00
160 1.5.3.3.4 Code and Unit Testing $49,573.44 $148,720.32
161 1.5.3.3.5 Integration Testing $49,573.44 $148,720.32
162 1.5.3.3.6 System Test $49,573.44 $148,720.32
163 1.5.3.4 Debugging/Testing $1,652,448.00 $4,957,344.00
164 1.5.3.4.1 Requirements Specification $89,232.18 $267,696.54
165 1.5.3.4.2 Preliminary Design $198,293.76 $594,881.28
166 1.5.3.4.3 Detailed Design $1,156,713.60 $3,470,140.80
167 1.5.3.4.3.1 Testing Interface $277,611.26 $832,833.78
168 1.5.3.4.3.1.1 User Interface $185,074.17 $555,222.51
169 1.5.3.4.3.1.2 Module Interface $92,537.09 $277,611.27
170 1.5.3.4.3.2 Hardware Interface $416,416.90 $1,249,250.70
171 1.5.3.4.3.2.1 User Interface $83,283.38 $249,850.14
172 1.5.3.4.3.2.2 Module Interface $83,283.38 $249,850.14
173 1.5.3.4.3.2.3 Server Interface $83,283.38 $249,850.14
174 1.5.3.4.3.2.4 RFID Interface $166,566.76 $499,700.28
175 1.5.3.4.3.2.4.1 Virtual Barcode Scanner $55,522.25 $166,566.75
176 1.5.3.4.3.2.4.2 RFID Virtual Interrogator $55,522.25 $166,566.75
177 1.5.3.4.3.2.4.3 RFID Virtual Tags $55,522.25 $166,566.75
178 1.5.3.4.3.3 Software Test Suite $462,685.44 $1,388,056.32
179 1.5.3.4.3.3.1 Requisition Test Logic $92,537.09 $277,611.27
180 1.5.3.4.3.3.2 Integration and System Test $92,537.09 $277,611.27
181 1.5.3.4.3.3.3 Security Test $92,537.09 $277,611.27

182 1.5.3.4.3.3.4 Failover Test $92,537.09 $277,611.27
183 1.5.3.4.3.3.5 Performance Test Logic $92,537.09 $277,611.27
184 1.5.3.4.4 Code and Unit Testing $69,402.82 $208,208.46
185 1.5.3.4.5 Integration Testing $69,402.82 $208,208.46
186 1.5.3.4.6 System Test $69,402.82 $208,208.46
187 1.5.4 OUTSOURCE $472,128.00 $1,416,384.00
188 1.5.4.1 Graphics $472,128.00 $1,416,384.00
189 1.5.4.1.1 Requirements $129,835.20 $389,505.60
190 1.5.4.1.1.1 Interview Users $94,425.60 $283,276.80
191 1.5.4.1.1.2 Write Requirements Document $35,409.60 $106,228.80
192 1.5.4.1.2 Preliminary Design $82,622.40 $247,867.20
193 1.5.4.1.2.1 Divide into Components $11,803.20 $35,409.60
194 1.5.4.1.2.2 Create Prototype $70,819.20 $212,457.60
195 1.5.4.1.3 Detailed Design $141,638.40 $424,915.20
196 1.5.4.1.3.1 Write Use Cases $70,819.20 $212,457.60
197 1.5.4.1.3.2 Create UML Diagrams $70,819.20 $212,457.60
198 1.5.4.1.4 Code and Unit Test $118,032.00 $354,096.00
199 1.6 Hardware $11,025,000.00 $33,075,000.00
200 1.7 Kernel $3,000,000.00 $9,000,000.00
201 1.8 Documentation $626,250.00 $1,878,750.00
202 1.9 Software Support Environment $389,046.00 $1,167,138.00
203 1.10 Post Deployment Software Support $2,162,000.00 $6,486,000.00
Table 34 – Budget & Schedule Allocation

Table 35 – Resource Allocation (Part I)

Table 36 – Resource Allocation (Part II)

Table 37 – Resource Allocation (Part III)

Figure 42 – Example Resource Allocation (Matt Henry, October 2007 – January 2008)

5.5 Project Schedule

Figure 43 - Schedule Estimate - Overview

Figure 44 - Schedule Estimate - Detail (Part I)

Figure 45 - Schedule Estimate - Detail (Part II)

Figure 46 - Schedule Estimate - Detail (Part III)

Figure 47 - Schedule Estimate - Detail (Part IV)

Figure 48 - Schedule Estimate – Resource List Detail

Appendix A: Zachman Enterprise Architecture Framework
Scope
  Data (What): Weapons, Marines, Arsenals
  Function (How): Keeping Inventory of Weapons
  Network (Where): Marine Base Locations (e.g. Camp Pendleton)
  People (Who): Armory Clerks, Staff Sergeant, Junior Enlisted Soldiers, Supply and Audit Personnel
  Time (When): Delivery of New Weapons, Disposal of Weapons, Troop Exercises
  Motivation (Why): Enforce Weapon Accountability, Automate Inventory Process

Enterprise Model (Conceptual)
  Data: Semantic Model
  Function: ADR/Armory Use Case Diagram (Fig. 2)
  Network: GIS Model of Marine Base Locations and ADR
  People: End-User Profile (Section 1.1.2.3)
  Time: Marine Corps Troop Movement Plans, Acquisition and Procurement of Weapons Plans
  Motivation: Marine Arsenal Inventory Regulations

System Model (Logical)
  Data: UML Model
  Function: Use Cases
  Network: System Architecture Diagram (Fig. 1)
  People: Client System Screen Specifications
  Time: Sequence Diagrams
  Motivation: List of Business Rules and their relationships

Technology Model (Physical)
  Data: ER Diagram
  Function: State Diagrams
  Network: Network Diagram
  People: Presentation Layer Prototype
  Time: Communication Layer Design
  Motivation: Business Logic Layer Design

Detailed Representation (Out-of-Context)
  Data: Data Definition Language (DDL) T-SQL Script
  Function: C# Code
  Network: Server Names, IP Addresses, Protocols to Use
  People: Windows Form Application
  Time: Web Service Method Signatures, Server API
  Motivation: Business Logic Dynamically Loaded Library (DLL)

Functioning Enterprise
  Data: Data
  Function: AWATS
  Network: Networks
  People: USMC
  Time: Schedules
  Motivation: USMC Strategy
Table 38 – Zachman Enterprise Architecture Framework

Appendix B: Documentation Costs
Format | Written By | Pages | Time to Initiate | Who Approves | Distribution | Price
Corporation Format | Marketing, AWATS PM | 20 | Before Project Initiates | Senior Corporate Mngmt. | Senior Corporate Management, AWATS PM | $3,000.00
DI-MCCR-80534 (MIL-STD-498) | Systems Engineer, Design Teams | 150 | After SPMP | Software Quality Control | AWATS PM, Package PM’s, Package Design Teams | $22,500.00
IEEE Std. 1058.1-1987 | AWATS PM, Package PM’s, Assistant PM | 300 | Before Project Initiates | Senior Corporate Mngmt. | Entire Software Team | $45,000.00
IEEE Std 830-1984 | Requirements Analyst, Subject Matter Expert | 230 | After SPMP | Subject Matter Expert, Client PM, AWATS PM | AWATS PM, Package PM’s, Technical Team Leader, Chief Programmer | $34,500.00
DI-MCCR-80026 | Tester | 175 | After SPMP | Software Quality Control | Programmers | $26,250.00
DI-MCCR-80027 | Chief Programmer | 225 | After SRS | Software Quality Control | Programmers | $33,750.00
IEEE Std 1016-1998 | Design Team | 350 | After SRS | Chief Programmer | Technical Team Leader, Programmers, Testers | $52,500.00
DI-IPSC-81439 (MIL-STD-498) | Test and Evaluation Team | 275 | Before Development is Initiated | Software Quality Control | AWATS PM | $55,000.00
NASA-DID-M600 | Configuration Management Team | 45 | After every software build | Assistant PM | Documentation Team | $6,750.00
DI-IPSC-81440 (MIL-STD-498) | Test and Evaluation Team | 125 | After software tests are performed and after build | Assistant PM | AWATS PM, Package PM’s | $25,000.00
DI-IPSC-81444 (MIL-STD-498) | Manual Team | 290 | After Detailed Design | Help Desk Team | Help Desk Team, Client | $174,000.00
IEEE Std 1063-1987 | Manual Team | 180 | After Detailed Design | Client | Help Desk Team, Training Team | $108,000.00
DI-IPSC-81447 (MIL-STD-498) | Manual Team, Chief Programmer | 200 | After Detailed Design | Technical Team Leader | Help Desk Team | $40,000.00
Total Cost: $626,250.00
Table 39 – Documentation Costs

Appendix C: Hardware Costs

Equipment Manufacturer Cost/Armory


2D Barcode Scanner Symbol $850.00
Passive RFID Tag Portal Symbol $12,000.00
Passive RFID Tags Alien $350.00
AWATS Client Hardware ABC $1,500.00
     
Cost per Armory:   $14,700.00
Adjusted (750 Armories):   $11,025,000.00
Table 40 – Hardware Costs

Appendix D: Helpdesk Costs
Resource | Cost | Remarks
Staff | $800,000.00 | Rotating staff of up to twenty (20) Phone Operators and six (6) Engineers.
Facility | $1,200,000.00 | Physical space for helpdesk/engineer staff.
Telephone Lines | $2,000.00 | 800 number which clients can call anytime for support.
Travel | $60,000.00 | Budget for up to four (CONUS) trips for an engineer per month.
Equipment | $100,000.00 | Laptops, Monitors, Headsets, etc.
Total Cost: $2,162,000.00
Table 41 – Helpdesk Costs

Appendix E: Software Package Costs

Section Staff Months Staff Hours (160 * Staff Months)


Graphics 30.20 4832
Debug/Test 56.80 9088
Word Processor 55.10 8816
Communication 14.30 2288
Tracking Inventory 405.00 64800
GPS 8.90 1424
Requirements Management 30.00 4800
Spreadsheet 8.30 1328
Program Management 9.60 1536
Compiler 14.50 2320
Database 105.00 16800
     
Total Hours:   118032
Adjusted ($200/Hour):   $23,606,400.00
Table 42 – Software Package Costs

Appendix F: Software Tool Costs

Tool | $/License | Licenses | Total Cost | Comments

Applications Tools
Rational Rose | $50,000.00 | 2 | $100,000.00 | Need 2 licenses to share the workload
Rational Requisite Pro | $15,000.00 | 2 | $30,000.00 | Need 2 licenses to share the workload
Rational Unified Process | $45,000.00 | 2 | $90,000.00 | Need 2 licenses to share the workload
Microsoft Team Foundation Server | $109,000.00 | 1 | $109,000.00 |

Design Tools
MS SQL Management Studio | $90,000.00 | 2 | $180,000.00 |
Rational Rose | $0.00 | 0 | $0.00 | Use the same licenses as in Applications Tools
Rational Unified Process | $0.00 | 0 | $0.00 | Use the same licenses as in Applications Tools
CASE Tools | $25,500.00 | 1 | $25,500.00 |

Code Tools
C# | $0.00 | 0 | $0.00 | Included in the VS 2005 Team Suite
Visual Studio 2005 Team Suite | $13,000.00 | 13 | $169,000.00 | Assuming that all developers share 13 licenses
Rational ClearCase | $20,000.00 | 2 | $40,000.00 |
Rational Unified Process | $0.00 | 0 | $0.00 | Use the same licenses as in Applications Tools

Unit Test Tools
NUnit | $0.00 | 50 | $0.00 | Free tool
Rational Robot | $6,000.00 | 5 | $30,000.00 | Need 5 clients to run the tests
Rational ClearCase | $0.00 | 0 | $0.00 | Use the same licenses as in Code Tools
Rational Unified Process | $0.00 | 0 | $0.00 | Use the same licenses as in Applications Tools
Rational TestManager | $15,000.00 | 1 | $15,000.00 | Only need 1 TestManager to manage tests

Integration Test Tools
Rational TestManager | $0.00 | 0 | $0.00 | Use the same licenses as in Unit Test Tools
Rational Robot | $0.00 | 0 | $0.00 | Use the same licenses as in Unit Test Tools
Rational ClearCase | $0.00 | 0 | $0.00 | Use the same licenses as in Code Tools
Rational Unified Process | $0.00 | 0 | $0.00 | Use the same licenses as in Applications Tools

System Test Tools
Rational TestManager | $0.00 | 0 | $0.00 | Use the same licenses as in Unit Test Tools
Rational Robot | $0.00 | 0 | $0.00 | Use the same licenses as in Unit Test Tools
Rational ClearCase | $0.00 | 0 | $0.00 | Use the same licenses as in Code Tools
Rational Unified Process | $0.00 | 0 | $0.00 | Use the same licenses as in Applications Tools

Deployment Tools
Symantec Norton Antivirus | $10.00 | 800 | $8,000.00 |
The Shield Pro | $13.00 | 800 | $10,400.00 |
Bullguard Internet Security | $8.00 | 800 | $6,400.00 |
EZAntivirus | $6.00 | 800 | $4,800.00 |
BitDefender 9 | $5.00 | 800 | $4,000.00 |

Maintenance Tools
Parature, Inc. Help Desk Software | $500.00 | 10 | $5,000.00 |
Laplink | $4.00 | 800 | $3,200.00 |

Disposal Tools
Dell Computer Recycling Service | $0.00 | 800 | $0.00 | Free service provided by Dell

Total Cost: $389,046.00
Table 43 – Software Tool Costs

Appendix G: Cost Certainty
As with any major project undertaking, it is important to approach the work with as much cost
certainty as possible. This information can be derived from several metrics; however, upper
management in this case called for a statistical analysis based on a projected earned value between the
20% and 40% completion milestones. This will enable upper management to set aside any reserves
necessary for unforeseen problems, and it will also set an expectation for profit in terms of the total
system cost.

Below are the results of the statistical analysis:

[Chart: projected Actual Costs and Earned Value (in dollars) over time, divided into Analysis Phase I (0% - 20%), Analysis Phase II (20% - 40%), and the Project Manager's Estimate of Final Cost (40% - 100%). Note: the 40% - 100% region (60% of the project) marks an opportunity for improvement.]

As noted in the analysis, there are evaluations in two distinct (and early) phases of AWATS
development. After completing 20% of the project, AWATS is projected to be within budget with a
high rate of earned value. The projections further predict a continuing trend all the way toward the
40% completion point, with an even higher rate of certainty. The analysis confirms that AWATS is
extremely likely to be completed on budget and with the expected profit margin.

This analysis leaves the remaining 60% as an opportunity for improvement. Should it be necessary to
adjust the schedule or budget, this region is where it will likely occur. Due to the success of our
previous project analyses, and our adherence to CMMI Level 3 best practices, we are confident that the
schedule will hold and reserves will remain intact.
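For reference, the earned-value arithmetic behind this kind of analysis is straightforward. The sketch below uses hypothetical milestone figures; only the budget at completion is taken from Table 34:

    // Earned-value arithmetic; the milestone figures are hypothetical.
    double pv = 24000000;   // planned value at the 20% milestone
    double ev = 25000000;   // earned value (budgeted cost of work performed)
    double ac = 23500000;   // actual cost of work performed

    double cpi = ev / ac;   // cost performance index; > 1.0 means under budget
    double spi = ev / pv;   // schedule performance index; > 1.0 means ahead of plan

    double bac = 122426088; // budget at completion (Table 34 total)
    double eac = bac / cpi; // estimate at completion if the current CPI holds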

Appendix H: Documentation Process

Document Production Process Flow Diagram


1. A committee of SME's, assigned authors, and a technical writer is established for the document.
2. The technical writer interviews the SME's to get relevant information.
3. The technical writer creates the document outline.
4. The committee reviews the outline; if disapproved, the outline is revised and re-reviewed.
5. Once the outline is approved, the technical writer creates a draft.
6. The committee reviews the draft; if disapproved, the draft returns to step 5.
7. QA reviews the document; if disapproved, the draft returns to step 5.
8. The approved document is put into CM and versioned.
9. The customer reviews the document; if disapproved, the draft returns to step 5.
10. Upon customer approval, the finished document is distributed.

Figure 49 - Documentation Process

Appendix I: Product Assurance Documentation TOC
1. Introduction
1.1 Purpose
1.2 Scope
1.3 Software Items Covered

2. Reference Documents

3. Management
3.1 Organization
3.2 Tasks
3.3 Responsibilities

4. Documentation
4.1 Purpose
4.2 Minimum Documentation Requirements
4.2.1 Software Requirements Specification
4.2.2 Software Design Description
4.2.3 Software Verification and Validation Plan
4.2.4 Software Verification and Validation Report
4.2.5 User Documentation

4.3 Other Documentation


4.3.1 Software Development Plan
4.3.2 Software Configuration Management Plan
4.3.3 Standards and Procedures Manual

5. Standards, Practices, and Conventions


5.1 Purpose
5.2 Contents
5.2.1 Documentation Standards
5.2.2 Logic Structure Standards
5.2.3 Coding Standards
5.2.4 Commentary Standards

6. Reviews and Audits


6.1 Purpose
6.2 Minimum Requirements
6.2.1 Software Requirements Review
6.2.2 Preliminary Design Review
6.2.3 Critical Design Review
6.2.4 Software Verification and Validation Review
6.2.5 Functional Audit

6.2.6 Physical Audit
6.2.7 In-Process Audits
6.2.7.1 Code vs. design documentation
6.2.7.2 Interface specifications
6.2.7.3 Design specs vs. functional requirements
6.2.7.4 Functional requirements vs. test plan
6.2.8 Managerial Reviews

7. Software Configuration Management Plan


7.1 Identification of Software Items
7.2 Change Control
7.3 Reporting Status Changes

8. Problem Reporting and Corrective Action


8.1 Practices and Procedures
8.2 Organizational Responsibilities

9. Tools, Techniques, and Methodologies

10. Code Control

11. Media Control

12. Supplier Control

13. Records Collection, Maintenance, and Retention

Appendix J: Complete Work Breakdown Structure (WBS)
Work Package Specification: COMPILE/LINK/RUNTIME-CR-001

Activity Number: 1.5.1.2.3.1.4


Activity Name: COMPILE/LINK/RUNTIME-CR-001
Feature Description: Configure/set up a CRMP Configuration Management Interface
for the project members to keep the software baseline and updates in synchronization.
Activity Description: Configure a CRMP CM Interface to provide a stable environment
for all team members, especially the developers, to develop and integrate all AWATS
software modules.
Estimated Duration: 1 Week
Resources Needed:
Personnel: 1 Requirements Analyst, 1 Systems Engineer/Analyst, 1 Programmer,
1 RFID Engineer
Skills: Development environment configuration knowledge, CM knowledge
Tools: Microsoft Visual Studio 2005
Travel: None
Work Product: Configuration of the CRMP CM interface to link up the compiled
AWATS modules with the CM repository.
Risks: Lack of knowledge in linking the source code to the CM repository can cause loss of
source code.
Predecessors: Completion of MS Visual Studio 2005 development environment user
interface.
Completion Criteria: Source code can be used, stored, and checked in through MS Visual
Studio 2005.

IMPLEMENTATION
Personnel Assigned:
Starting Date:
Completion Date:
Costs (budgeted/Total): $26,276.25/$78,828.75
Legacy Comments:

Work Package Specification: PROJECT MANAGEMENT-PM-001

Activity Number: 1.5.1.3.3.1.1


Activity Name: PROJECT MANAGEMENT-PM-001
Feature Description: Configure the graphical interface on the project management tool
to work with the graphical presentation package.
Activity Description: Create the graphical interface on Microsoft Project 2003 for
importing and exporting graphics and charts for the project schedules as well as the
Project Manager’s project reports.
Estimated Duration: 2 Weeks
Resources Needed:
Personnel: 1 Requirements Analyst, 1 Programmer, 2 RFID Engineers, Project
Manager
Skills: Microsoft Project knowledge, project management knowledge, the
outsourced graphic package knowledge
Tools: MS Project 2003
Travel: 10% for the software engineer from Ivan Industries, Inc.
Work Product: The graphic interface implemented integrated with MS Project 2003.
Risks: Schedule slip from the outsourced vendor Ivan Industries, Inc.
Predecessors: Microsoft Project 2003 configured successfully.
Completion Criteria: Successful integration of the graphic presentation package into
the PM package.

IMPLEMENTATION
Personnel Assigned:
Starting Date:
Completion Date:
Costs (Budgeted/Total): $35,035.00/$105,105.00
Legacy Comments:

Work Package Specification: DATABASE-DD-001

Activity Number: 1.5.1.1.3.1.3


Activity Name: DATABASE-DD-001
Feature Description: Configure a graphical interface to create, retrieve, update, and
delete RFID tag, weapon, and Marine data in the AWATS database.
Activity Description: Configure the user interfaces on both the user and the database
sides, allowing accurate and user-friendly database handling.
Estimated Duration: 2 Weeks
Resources Needed:
Personnel: 1 Requirements Analyst, 1 Systems Engineer/Analyst, 1 Database
Administrator, 1 Programmer, 1 RFID Engineer
Skills: User interface configuration knowledge, requirements knowledge,
database setting-up and tuning knowledge
Tools: Microsoft SQL Server 2005, User Interface Builder
Travel: None
Work Product: Implementation of a database user interface configuration to link up the
COTS database to the user interfaces.
Risks: Incorrect configurations will cause slow data transactions or even loss of data.
Predecessors: Hardware interface is completed.
Completion Criteria: Approval of a set of configurations that will be used on the
AWATS system to interface between the database and the users.

IMPLEMENTATION

Personnel Assigned:
Starting Date:
Completion Date:
Costs (Budgeted/Total): $300,981.60/$902,944.80
Legacy Comments:

Work Package Specification: SPREADSHEET-SS-001

Activity Number: 1.5.2.1.3.1.2


Activity Name: SPREADSHEET-SS-001
Feature Description: Memory Scanning Algorithm
Activity Description: Implement an algorithm to communicate with hardware (memory)
in order to boost efficiency for applications requiring access to proprietary AWATS items
in memory.
Estimated Duration: 6 Weeks
Resources Needed:
Personnel: 1 Requirements Analyst, 1 Chief Programmer, 1 Programmer, 2
Testers
Skills: Assembly Code, Memory, Hardware
Tools: Assembly Code Debugger, Memory Core Monitor
Travel: None
Work Product: Functional Memory Scanning Algorithm which can efficiently organize
system memory.
Risks: Algorithm may function differently between memory manufacturers.
Predecessors: None
Completion Criteria: Approval of functioning algorithm used on test hardware (similar
or equivalent to production quality).

IMPLEMENTATION

Personnel Assigned:
Starting Date:
Completion Date:
Costs (Budgeted/Total): $95,181.01/$285,543.03
Legacy:
Comments:

Work Package Specification: REQUIREMENTSMGMT-RM-001

Activity Number: 1.5.2.2.3.3.1


Activity Name: REQUIREMENTSMGMT-RM-001
Feature Description: Communication with the Electronic Product Code (EPC) web service
Activity Description: Construction of a web service module which is capable of
querying data from EPC
Estimated Duration: 2 Weeks
Resources Needed:
Personnel: 1 Requirements Analyst, 1 Technical Team Leader, 1 Programmer, 1
Tester
Skills: .NET Web Services, XML
Tools: Visual Studio 2005
Travel: None
Work Product: Web Service capable of data transfer with the International EPC server.
Risks: Web Service may experience downtime for maintenance.
Predecessors: None
Completion Criteria: Finished Web Service passes all unit tests as well as functional
tests administered by test staff.

IMPLEMENTATION

Personnel Assigned:
Starting Date:
Completion Date:
Costs (Budgeted/Total): $368,259.84/$1,104,779.52
Legacy Comments:
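A minimal sketch of the kind of query module this work package calls for is shown below, using the .NET 2.0 classes available to the project; the endpoint URL and query format are invented placeholders, not the real EPC service interface:

    using System.Net;
    using System.Xml;

    class EpcQueryModule
    {
        // Fetch the XML record for one Electronic Product Code.
        // The URL below is a placeholder for the actual EPC service.
        public static XmlDocument Query(string epc)
        {
            using (WebClient client = new WebClient())
            {
                string xml = client.DownloadString(
                    "https://epc.example.org/lookup?code=" + epc);
                XmlDocument doc = new XmlDocument();
                doc.LoadXml(xml);
                return doc;
            }
        }
    }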

Work Package Specification: GPS-G-001

Activity Number: 1.5.2.3.3.2.1


Activity Name: GPS-G-001
Feature Description: Database of GPS data
Activity Description: Gather and populate GPS data
Estimated Duration: 8 Weeks
Resources Needed:
Personnel: 1 Requirements Analyst, 1 Technical Team Leader, 2 Programmers, 1 QA
Specialist, 1 Tester
Skills: GPS, Math, Data Entry
Tools: GPS Mapping Software
Travel: None
Work Product: Database capable of mapping GPS input to map image.
Risks: Bad data entry.
Predecessors: None
Completion Criteria: Final database will be extensively tested using random queries spanning every
100 square miles of the globe.

IMPLEMENTATION

Personnel Assigned:
Starting Date:
Completion Date:
Costs (Budgeted/Total): $76,484.74/ $229,454.22
Legacy Comments:

Work Package Specification: ELECTRONIC INVENTORY-EY-001

Activity Number: 1.5.3.1.1.1


Activity Name: ELECTRONIC INVENTORY-EY-001
Feature Description: Interview the Marines on how to keep track of the weaponry.
Activity Description: Create sets of questionnaires and user surveys to gather requirements on how to
keep track of the weaponry for the RFID Virtual Tags
Estimated Duration: 5 Weeks
Resources Needed:
Personnel: 3 Requirements Analysts, 1 Systems Engineer/Analyst, 3 RFID Engineers, 1 RFID
Integration Manager
Skills: Requirements gathering, user interviewing techniques, requirements knowledge, RFID
knowledge, user surveys and questionnaires knowledge
Tools: In-house tool.
Travel: 60%
Work Product: AWATS requirements document on the RFID Virtual tags
Risks: Gathering incorrect user needs; users not available.
Predecessors: Users identified.
Completion Criteria: Approval of the requirements document.

IMPLEMENTATION
Personnel Assigned:
Starting Date:
Completion Date:
Costs (budgeted/Total): $396,587.52/$1,189,762.56
Legacy Comments:

Work Package Specification: COMMUNICATION-CM-001

Activity Number: 1.5.3.2.3.1.2


Activity Name: COMMUNICATION-CM-001
Feature Description: Create a graphical interface for communications at the AWATS server site
Activity Description: Create a detailed design and mockup of server interface used for secure
communication
Estimated Duration: 3 Weeks
Resources Needed:
Personnel: 1 Requirements Analyst, 1 Systems Engineer/Analyst, 1 Programmer, 1 RFID
Engineer
Skills: User interface design knowledge, requirements knowledge, Secure Socket Layer (SSL)
knowledge
Tools: Rational Suite, User Interface Builder
Travel: None
Work Product: Detailed design and mockup of user interface
Risks: RFID complications
Predecessors: Preliminary design of client interface (done in parallel)
Completion Criteria: Approval of detailed design

IMPLEMENTATION
Personnel Assigned:
Starting Date:
Completion Date:
Costs (budgeted/Total): $363,538.56/$1,090,615.68
Legacy Comments:
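For context, .NET's SslStream class (System.Net.Security) is one way to realize the SSL link this package designs. The sketch below shows the client side under that assumption; the host and port are placeholders, and certificate validation is left to the framework defaults:

    using System.IO;
    using System.Net.Security;
    using System.Net.Sockets;

    class SecureClientLink
    {
        // Open an SSL/TLS-protected connection to the AWATS server.
        public static Stream Connect(string host, int port)
        {
            TcpClient tcp = new TcpClient(host, port);
            SslStream ssl = new SslStream(tcp.GetStream(), false);
            ssl.AuthenticateAsClient(host); // performs the SSL handshake
            return ssl;                     // then read/write as a normal stream
        }
    }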

Work Package Specification: WORDPROCESSING-WP-001

Activity Number: 1.5.3.3.3.1.2


Activity Name: WORDPROCESSING-WP-001
Feature Description: Create a link/interface to the graphics package developed by Ivan Industries
Activity Description: Create a detailed design and mockup of link/interface to the graphics package
for reporting capabilities
Estimated Duration: 4 Weeks
Resources Needed:
Personnel: 1 Requirements Analyst, 1 Systems Engineer/Analyst, 1 Programmer, 1 RFID
Engineer
Skills: User interface design knowledge, requirements knowledge
Tools: Rational Suite, User Interface Builder
Travel: 20%
Work Product: Detailed design and mockup of link/interface to graphics package
Risks: Time constraints: the word processing package and the graphics package (developed by Ivan
Industries) are developed almost in parallel; any setback with the graphics package may directly affect
development of the word processing package.
Predecessors: Preliminary design of reporting user interface within word processing package
Completion Criteria: Approval of word processing detailed design, approval of detailed design for
graphics package

IMPLEMENTATION
Personnel Assigned:
Starting Date:
Completion Date:
Costs (budgeted/Total): $300,079.00/$900,237.00
Legacy Comments:

Work Package Specification: DEBUGGING-DB-001

Activity Number: 1.5.3.4.3.2.4.3


Activity Name: DEBUGGING-DB-001
Feature Description: Create a graphical interface to the RFID Virtual Tags
Activity Description: Create a detailed design and mockup of interface used for the RFID Virtual
Tags
Estimated Duration: 5 Weeks
Resources Needed:
Personnel: 1 Requirements Analyst, 1 Systems Engineer/Analyst, 1 Programmer, 2 RFID
Engineers, 1 RFID Integration Manager
Skills: User interface design knowledge, requirements knowledge, RFID knowledge
Tools: Rational Suite, User Interface Builder
Travel: None
Work Product: Detailed design and mockup of interface to the RFID Virtual tags
Risks: RFID Complications
Predecessors: Preliminary design of other interfaces (User, Server, and Module)
Completion Criteria: Approval of detailed design

IMPLEMENTATION
Personnel Assigned:
Starting Date:
Completion Date:
Costs (budgeted/Total): $55,522.25/$166,566.75
Legacy Comments:

Work Package Specification: GRAPHIC-GP-001

Activity Number: 1.5.4.1.3.1


Activity Name: GRAPHIC-GP-001
Feature Description: Write use cases for the detailed design on the Graphic Presentation Package.
Activity Description: Write use cases that will benefit both the graphics developers and the
AWATS developers.
Estimated Duration: 7 Weeks
Resources Needed:
Personnel: 2 Requirements Analysts, 4 Programmers (including 3 from Ivan Industries)
Skills: Graphic designer skills, User interface design knowledge
Tools: None
Travel: 15%
Work Product: Use cases supporting the detailed design of the Graphic Presentation Package.
Risks: Schedule delay due to the delivery of the graphic interfaces.
Predecessors: Preliminary design of other interfaces (User, Server, and Module)
Completion Criteria: Approval of detailed design

IMPLEMENTATION
Personnel Assigned:
Starting Date:
Completion Date:
Costs (budgeted/Total): $70,819.20/$212,457.60
Legacy Comments:

Appendix K: COCOMO II Complete Output


##USC COCOMO II.1999.0##

DEVELOPMENT MODEL: Post Architecture Model


Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

Scale Factors: PREC FLEX RESL TEAM PMAT


N N N N N
2.43 3.64 2.53 2.97 2.73

Module Name | Size (SLOC) | EAF | NOM Effort DEV | ACT Effort DEV | PROD RATE | COST | INST COST | Staff | RISK
General Purpose Database | 120000 | 1.00 | 860.3 | 860.3 | 139.5 | 0 | 0.0 | 12.9 | 0.0
Spreadsheet | 65000 | 1.00 | 466.0 | 466.0 | 139.5 | 0 | 0.0 | 7.0 | 0.0
Configuration/Requirements Mgmt | 78000 | 1.00 | 559.2 | 559.2 | 139.5 | 0 | 0.0 | 8.4 | 0.0
Communications | 62000 | 1.00 | 444.5 | 444.5 | 139.5 | 0 | 0.0 | 6.7 | 0.0
Graphics Presentation | 110000 | 1.00 | 788.6 | 788.6 | 139.5 | 0 | 0.0 | 11.8 | 0.0
Word Processing | 65000 | 1.00 | 466.0 | 466.0 | 139.5 | 0 | 0.0 | 7.0 | 0.0
Project Management | 56000 | 1.00 | 401.5 | 401.5 | 139.5 | 0 | 0.0 | 6.0 | 0.0
GPS Navigation | 74000 | 1.00 | 530.5 | 530.5 | 139.5 | 0 | 0.0 | 7.9 | 0.0
Compile/Link/Runtime | 56000 | 1.00 | 401.5 | 401.5 | 139.5 | 0 | 0.0 | 6.0 | 0.0
Debugging/Testing | 65000 | 1.00 | 466.0 | 466.0 | 139.5 | 0 | 0.0 | 7.0 | 0.0
Electronic Inventory/Tracking | 365000 | 1.00 | 2616.7 | 2616.7 | 139.5 | 0 | 0.0 | 39.2 | 0.0

TOTAL SLOC 1116000
OPTIMISTIC  | 6400.5  | 174.4 | 0 | 0.0 | 103.9 | 0.0
MOST LIKELY | 8000.6  | 139.5 | 0 | 0.0 | 119.8 | 0.0
PESSIMISTIC | 10000.8 | 111.6 | 0 | 0.0 | 138.3 | 0.0

             OPTIMISTIC   MOST LIKELY   PESSIMISTIC
SCHEDULE     61.6         66.8          72.3

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE NAME SIZING DETAILS

MODULE NAME            SIZING   BRAK   INITSLOC
General Purpose Data   SLOC     0        120000
Spreadsheet            SLOC     0         65000
Configuration/Requir   SLOC     0         78000
Communications         SLOC     0         62000
Graphics Presentatio   SLOC     0        110000
Word Processing        SLOC     0         65000
Project Management     SLOC     0         56000
GPS Navigation         SLOC     0         74000
Compile/Link/Runtime   SLOC     0         56000
Debugging/Testing      SLOC     0         65000
Electronic Inventory   SLOC     0        365000

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

PROJECT PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution


=========================================================================
=====
PROJECT AWATS
SLOC 1116000
TOTAL EFFORT 8000.606 Person Months
=========================================================================
=====
EFFORT PCNT   EFFORT (PM)   SCHED PCNT   SCHEDULE (Months)   Staff
Plans And Requirements 7.000 560.042 22.406 14.959 37.439
Product Design 17.000 1360.103 27.203 18.162 74.889
Programming 54.392 4351.660 43.189 28.835 150.918
- Detailed Design 23.797 1903.921 ---- ---- ----
- Code and Unit Test 30.594 2447.739 ---- ---- ----
Integration and Test 28.608 2288.843 29.608 19.768 115.787

=========================================================================
=====
Life Cycle Phase Plans And Requirements
Life Cycle Effort 560.042 Person Months
Life Cycle Schedule 14.959 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 250.883 14.959 16.772
Product Design 17.601 98.575 14.959 6.590
Programming 5.703 31.938 14.959 2.135

Test Planning 4.101 22.970 14.959 1.536
Verification and Validation 7.601 42.571 14.959 2.846
Project Office 12.297 68.870 14.959 4.604
CM/QA 2.899 16.233 14.959 1.085
Manuals 5.000 28.002 14.959 1.872

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

PROJECT PHASE & ACTIVITY INFORMATION


========================================

=========================================================================
=====
Life Cycle Phase Product Design
Life Cycle Effort 1360.103 Person Months
Life Cycle Schedule 18.162 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 12.500 170.013 18.162 9.361
Product Design 41.000 557.642 18.162 30.704
Programming 13.601 184.993 18.162 10.186
Test Planning 6.101 82.985 18.162 4.569
Verification and Validation 7.601 103.387 18.162 5.693
Project Office 9.797 133.252 18.162 7.337
CM/QA 2.399 32.623 18.162 1.796
Manuals 7.000 95.207 18.162 5.242

=========================================================================
=====
Life Cycle Phase Programming
Life Cycle Effort 4351.660 Person Months
Life Cycle Schedule 28.835 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 4.000 174.066 28.835 6.037
Product Design 8.000 348.133 28.835 12.073
Programming 56.500 2458.688 28.835 85.269
Test Planning 5.601 243.754 28.835 8.454
Verification and Validation 8.601 374.303 28.835 12.981

Project Office 5.899 256.687 28.835 8.902
CM/QA 6.399 278.445 28.835 9.657
Manuals 5.000 217.583 28.835 7.546

=========================================================================
=====
Life Cycle Phase Integration and Test
Life Cycle Effort 2288.843 Person Months
Life Cycle Schedule 19.768 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 2.500 57.221 19.768 2.895
Product Design 5.000 114.442 19.768 5.789
Programming 39.406 901.932 19.768 45.627
Test Planning 3.101 70.986 19.768 3.591
Verification and Validation 28.196 645.358 19.768 32.647
Project Office 6.899 157.898 19.768 7.988
CM/QA 7.899 180.787 19.768 9.146
Manuals 7.000 160.219 19.768 8.105
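
The phase and activity tables above are straight percentage allocations: each phase receives a fixed percentage of total project effort, each activity a fixed percentage of its phase's effort, and Staff is effort divided by the phase schedule. The per-module tables that follow repeat the same arithmetic against each module's own effort total. A minimal sketch, using values copied from the report:

    # Arithmetic behind the phase/activity tables: phases split total
    # project effort by fixed percentages, activities split phase effort,
    # and Staff = effort / schedule. All inputs are copied from the report.
    TOTAL_EFFORT_PM = 8000.606
    TOTAL_SCHEDULE_MO = 66.8

    # Phase row: "Plans And Requirements  7.000  560.042  22.406  14.959  37.439"
    phase_pct, phase_sched_mo = 7.000, 14.959
    phase_effort = TOTAL_EFFORT_PM * phase_pct / 100       # ~560.042 PM
    phase_staff = phase_effort / phase_sched_mo            # ~37.44 average staff
    sched_pct = 100 * phase_sched_mo / TOTAL_SCHEDULE_MO   # ~22.4% of the schedule

    # Activity row in that phase:
    # "Requirements Analysis  44.797  250.883  14.959  16.772"
    act_effort = phase_effort * 44.797 / 100               # ~250.88 PM
    act_staff = act_effort / phase_sched_mo                # ~16.77 average staff

    print(f"phase: {phase_effort:.3f} PM, {phase_staff:.3f} staff, {sched_pct:.2f}% sched")
    print(f"activity: {act_effort:.3f} PM, {act_staff:.3f} staff")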

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

PROJECT MAINTENANCE INFORMATION


=============================================

PROJECT AWATS
Development SLOC 1116000
Development Nominal Person-Month 8000.61
Development Adjusted Person-Month 8000.61
Development Cost 0.00
Maintenance Labor Rate 0.00
Percentage Added 0.00%
Percentage Modified 0.00%
Maintenance Software Understanding 30.00
Unfamiliarity with the Software 0.40

Annual Change Traffic 0.00%

Effort Adjustment Factor 1.00

+ Product ++ Platform +
RELY DATA DOCU CPLX RUSE TIME STOR PVOL
Nominal Nominal Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0% 0% 0%
+ Personnel +
ACAP AEXP PCAP PEXP LTEX PCON
Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0%
+ Project + User +
TOOL SITE USR1 USR2
Nominal Nominal Nominal Nominal
0% 0% 0% 0%

WORK LOAD PROJECTION FOR THE NEXT 1 YEARS


Year KSLOC Effort_Nom Effort_Maint Staff KSLOC/Staff Cost
1 1116.00 0.00 0.00 0.00 0.00 0.00

SUMMARY OF PROJECTION
Cumulative Maintenance Person-month 0.00
Overall Development and Maintenance Person-month 8000.61
Cumulative Maintenance Cost 0.00
Overall Development and Maintenance Cost 0.00
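
The maintenance projection is zero because Annual Change Traffic (ACT) is 0.00%: with no code added or modified, the equivalent maintenance size, and therefore the maintenance effort, is zero regardless of the Software Understanding (30.00) and Unfamiliarity (0.40) ratings. The sketch below uses the standard COCOMO II maintenance sizing and assumes the published COCOMO II.2000 constants (A = 2.94, B = 0.91); the tool's COCOMO II.1999 calibration may differ slightly.

    # Why the maintenance projection is all zeros: standard COCOMO II
    # maintenance sizing. ASSUMPTION: published COCOMO II.2000 constants
    # (A = 2.94, B = 0.91); the tool's II.1999 calibration may differ.
    A, B = 2.94, 0.91
    SUM_SF = 2.43 + 3.64 + 2.53 + 2.97 + 2.73   # scale factors from the report
    E = B + 0.01 * SUM_SF                        # effort exponent, ~1.053

    BASE_KSLOC = 1116.0    # development size in KSLOC
    ACT = 0.00             # Annual Change Traffic (added + modified), per report
    SU, UNFM = 30.0, 0.40  # Software Understanding and Unfamiliarity, per report

    maf = 1.0 + (SU / 100.0) * UNFM       # Maintenance Adjustment Factor = 1.12
    maint_ksloc = BASE_KSLOC * ACT * maf  # equivalent maintenance size = 0
    maint_pm = A * maint_ksloc ** E       # maintenance effort = 0 person-months
    print(f"E = {E:.3f}, MAF = {maf:.2f}, maintenance effort = {maint_pm:.2f} PM")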

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

+ Product ++ Platform +
MODULE NAME           RELY  DATA  DOCU  CPLX  RUSE  TIME  STOR  PVOL
General Purpose Data  N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

+ Personnel ++ Project ++ User +
MODULE NAME           ACAP  AEXP  PCAP  PEXP  LEXP  PCON  TOOL  SCED  SITE  USR1  USR2
General Purpose Data  N     N     N     N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00
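
Because every cost driver above is rated Nominal, each effort multiplier is 1.00 and their product, the Effort Adjustment Factor (EAF), is exactly 1.00, which matches the EAF column in the summary table. A one-line check:

    import math

    # 19 cost drivers listed above: 8 product/platform drivers plus
    # 11 personnel, project, and user drivers, all rated Nominal.
    multipliers = [1.00] * 19
    print(f"EAF = {math.prod(multipliers):.2f}")   # -> EAF = 1.00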

MODULE PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution
=========================================================================
=====
MODULE General Purpose Data
SLOC 120000
TOTAL EFFORT 860.280 Person Months
=========================================================================
=====
EFFORT PCNT   EFFORT (PM)   SCHED PCNT   SCHEDULE (Months)   Staff
Plans And Requirements 7.000 60.220 22.406 14.959 4.026
Product Design 17.000 146.248 27.203 18.162 8.053
Programming 54.392 467.920 43.189 28.835 16.228
- Detailed Design 23.797 204.723 ---- ---- ----
- Code and Unit Test 30.594 263.198 ---- ---- ----
Integration and Test 28.608 246.112 29.608 19.768 12.450

=========================================================================
=====
Life Cycle Phase Plans And Requirements
Life Cycle Effort 60.220 Person Months
Life Cycle Schedule 14.959 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 26.977 14.959 1.803
Product Design 17.601 10.599 14.959 0.709
Programming 5.703 3.434 14.959 0.230
Test Planning 4.101 2.470 14.959 0.165
Verification and Validation 7.601 4.578 14.959 0.306
Project Office 12.297 7.405 14.959 0.495
CM/QA 2.899 1.746 14.959 0.117
Manuals 5.000 3.011 14.959 0.201

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE PHASE & ACTIVITY INFORMATION


========================================

=========================================================================
=====

Life Cycle Phase Product Design
Life Cycle Effort 146.248 Person Months
Life Cycle Schedule 18.162 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 12.500 18.281 18.162 1.007
Product Design 41.000 59.962 18.162 3.302
Programming 13.601 19.892 18.162 1.095
Test Planning 6.101 8.923 18.162 0.491
Verification and Validation 7.601 11.117 18.162 0.612
Project Office 9.797 14.328 18.162 0.789
CM/QA 2.399 3.508 18.162 0.193
Manuals 7.000 10.237 18.162 0.564

=========================================================================
=====
Life Cycle Phase Programming
Life Cycle Effort 467.920 Person Months
Life Cycle Schedule 28.835 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 4.000 18.717 28.835 0.649
Product Design 8.000 37.434 28.835 1.298
Programming 56.500 264.375 28.835 9.169
Test Planning 5.601 26.210 28.835 0.909
Verification and Validation 8.601 40.248 28.835 1.396
Project Office 5.899 27.601 28.835 0.957
CM/QA 6.399 29.940 28.835 1.038
Manuals 5.000 23.396 28.835 0.811

=========================================================================
=====
Life Cycle Phase Integration and Test
Life Cycle Effort 246.112 Person Months
Life Cycle Schedule 19.768 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 2.500 6.153 19.768 0.311
Product Design 5.000 12.306 19.768 0.623
Programming 39.406 96.982 19.768 4.906
Test Planning 3.101 7.633 19.768 0.386
Verification and Validation 28.196 69.393 19.768 3.510
Project Office 6.899 16.978 19.768 0.859

CM/QA 7.899 19.439 19.768 0.983
Manuals 7.000 17.228 19.768 0.872

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE MAINTENANCE INFORMATION


=============================================

MODULE General Purpose Data


Development SLOC 120000
Development Nominal Person-Month 860.28
Development Adjusted Person-Month 860.28
Development Cost 0.00
Maintenance Labor Rate 0.00
Percentage Added 0.00%
Percentage Modified 0.00%
Maintenance Software Understanding 30.00
Unfamiliarity with the Software 0.40

Annual Change Traffic 0.00%

Effort Adjustment Factor 1.00

+ Product ++ Platform +
RELY DATA DOCU CPLX RUSE TIME STOR PVOL
Nominal Nominal Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0% 0% 0%
+ Personnel +
ACAP AEXP PCAP PEXP LTEX PCON
Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0%
+ Project + User +
TOOL SITE USR1 USR2
Nominal Nominal Nominal Nominal
0% 0% 0% 0%

WORK LOAD PROJECTION FOR THE NEXT 1 YEARS


Year KSLOC Effort_Nom Effort_Maint Staff KSLOC/Staff Cost
1 120.00 0.00 0.00 0.00 0.00 0.00

SUMMARY OF PROJECTION
Cumulative Maintenance Person-month 0.00
Overall Development and Maintenance Person-month 860.28
Cumulative Maintenance Cost 0.00
Overall Development and Maintenance Cost 0.00

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

+ Product ++ Platform +
MODULE NAME           RELY  DATA  DOCU  CPLX  RUSE  TIME  STOR  PVOL
Spreadsheet           N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

+ Personnel ++ Project ++ User +
MODULE NAME           ACAP  AEXP  PCAP  PEXP  LEXP  PCON  TOOL  SCED  SITE  USR1  USR2
Spreadsheet           N     N     N     N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

MODULE PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution


=========================================================================
=====
MODULE Spreadsheet
SLOC 65000
TOTAL EFFORT 465.985 Person Months
=========================================================================
=====
EFFORT PCNT   EFFORT (PM)   SCHED PCNT   SCHEDULE (Months)   Staff
Plans And Requirements 7.000 32.619 22.406 14.959 2.181
Product Design 17.000 79.217 27.203 18.162 4.362
Programming 54.392 253.457 43.189 28.835 8.790
- Detailed Design 23.797 110.891 ---- ---- ----
- Code and Unit Test 30.594 142.565 ---- ---- ----
Integration and Test 28.608 133.311 29.608 19.768 6.744

=========================================================================
=====
Life Cycle Phase Plans And Requirements
Life Cycle Effort 32.619 Person Months
Life Cycle Schedule 14.959 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 14.612 14.959 0.977
Product Design 17.601 5.741 14.959 0.384
Programming 5.703 1.860 14.959 0.124
Test Planning 4.101 1.338 14.959 0.089
Verification and Validation 7.601 2.479 14.959 0.166
Project Office 12.297 4.011 14.959 0.268
CM/QA 2.899 0.945 14.959 0.063
Manuals 5.000 1.631 14.959 0.109

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE PHASE & ACTIVITY INFORMATION


========================================

=========================================================================
=====
Life Cycle Phase Product Design
Life Cycle Effort 79.217 Person Months
Life Cycle Schedule 18.162 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 12.500 9.902 18.162 0.545
Product Design 41.000 32.479 18.162 1.788
Programming 13.601 10.775 18.162 0.593
Test Planning 6.101 4.833 18.162 0.266
Verification and Validation 7.601 6.022 18.162 0.332
Project Office 9.797 7.761 18.162 0.427
CM/QA 2.399 1.900 18.162 0.105
Manuals 7.000 5.545 18.162 0.305

=========================================================================
=====
Life Cycle Phase Programming

Life Cycle Effort 253.457 Person Months
Life Cycle Schedule 28.835 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 4.000 10.138 28.835 0.352
Product Design 8.000 20.277 28.835 0.703
Programming 56.500 143.203 28.835 4.966
Test Planning 5.601 14.197 28.835 0.492
Verification and Validation 8.601 21.801 28.835 0.756
Project Office 5.899 14.950 28.835 0.518
CM/QA 6.399 16.218 28.835 0.562
Manuals 5.000 12.673 28.835 0.440

=========================================================================
=====
Life Cycle Phase Integration and Test
Life Cycle Effort 133.311 Person Months
Life Cycle Schedule 19.768 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 2.500 3.333 19.768 0.169
Product Design 5.000 6.666 19.768 0.337
Programming 39.406 52.532 19.768 2.657
Test Planning 3.101 4.134 19.768 0.209
Verification and Validation 28.196 37.588 19.768 1.901
Project Office 6.899 9.197 19.768 0.465
CM/QA 7.899 10.530 19.768 0.533
Manuals 7.000 9.332 19.768 0.472

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE MAINTENANCE INFORMATION


=============================================

MODULE Spreadsheet
Development SLOC 65000
Development Nominal Person-Month 465.99
Development Adjusted Person-Month 465.99

Development Cost 0.00
Maintenance Labor Rate 0.00
Percentage Added 0.00%
Percentage Modified 0.00%
Maintenance Software Understanding 30.00
Unfamiliarity with the Software 0.40

Annual Change Traffic 0.00%

Effort Adjustment Factor 1.00

+ Product ++ Platform +
RELY DATA DOCU CPLX RUSE TIME STOR PVOL
Nominal Nominal Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0% 0% 0%
+ Personnel +
ACAP AEXP PCAP PEXP LTEX PCON
Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0%
+ Project + User +
TOOL SITE USR1 USR2
Nominal Nominal Nominal Nominal
0% 0% 0% 0%

WORK LOAD PROJECTION FOR THE NEXT 1 YEARS


Year KSLOC Effort_Nom Effort_Maint Staff KSLOC/Staff Cost
1 65.00 0.00 0.00 0.00 0.00 0.00

SUMMARY OF PROJECTION
Cumulative Maintenance Person-month 0.00
Overall Development and Maintenance Person-month 465.99
Cumulative Maintenance Cost 0.00
Overall Development and Maintenance Cost 0.00

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

+ Product ++ Platform +
MODULE NAME           RELY  DATA  DOCU  CPLX  RUSE  TIME  STOR  PVOL
Configuration/Requir  N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

+ Personnel ++ Project ++ User +
MODULE NAME           ACAP  AEXP  PCAP  PEXP  LEXP  PCON  TOOL  SCED  SITE  USR1  USR2
Configuration/Requir  N     N     N     N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

MODULE PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution


=========================================================================
=====
MODULE Configuration/Requir
SLOC 78000
TOTAL EFFORT 559.182 Person Months
=========================================================================
=====
EFFORT PCNT   EFFORT (PM)   SCHED PCNT   SCHEDULE (Months)   Staff
Plans And Requirements 7.000 39.143 22.406 14.959 2.617
Product Design 17.000 95.061 27.203 18.162 5.234
Programming 54.392 304.148 43.189 28.835 10.548
- Detailed Design 23.797 133.070 ---- ---- ----
- Code and Unit Test 30.594 171.079 ---- ---- ----
Integration and Test 28.608 159.973 29.608 19.768 8.093

=========================================================================
=====
Life Cycle Phase Plans And Requirements
Life Cycle Effort 39.143 Person Months
Life Cycle Schedule 14.959 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 17.535 14.959 1.172
Product Design 17.601 6.890 14.959 0.461
Programming 5.703 2.232 14.959 0.149
Test Planning 4.101 1.605 14.959 0.107
Verification and Validation 7.601 2.975 14.959 0.199
Project Office 12.297 4.813 14.959 0.322
CM/QA 2.899 1.135 14.959 0.076
Manuals 5.000 1.957 14.959 0.131

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE PHASE & ACTIVITY INFORMATION


========================================

=========================================================================
=====
Life Cycle Phase Product Design
Life Cycle Effort 95.061 Person Months
Life Cycle Schedule 18.162 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 12.500 11.883 18.162 0.654
Product Design 41.000 38.975 18.162 2.146
Programming 13.601 12.930 18.162 0.712
Test Planning 6.101 5.800 18.162 0.319
Verification and Validation 7.601 7.226 18.162 0.398
Project Office 9.797 9.313 18.162 0.513
CM/QA 2.399 2.280 18.162 0.126
Manuals 7.000 6.654 18.162 0.366

=========================================================================
=====
Life Cycle Phase Programming
Life Cycle Effort 304.148 Person Months
Life Cycle Schedule 28.835 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 4.000 12.166 28.835 0.422
Product Design 8.000 24.332 28.835 0.844
Programming 56.500 171.844 28.835 5.960
Test Planning 5.601 17.037 28.835 0.591
Verification and Validation 8.601 26.161 28.835 0.907
Project Office 5.899 17.941 28.835 0.622
CM/QA 6.399 19.461 28.835 0.675
Manuals 5.000 15.207 28.835 0.527

=========================================================================
=====
Life Cycle Phase Integration and Test

Life Cycle Effort 159.973 Person Months
Life Cycle Schedule 19.768 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 2.500 3.999 19.768 0.202
Product Design 5.000 7.999 19.768 0.405
Programming 39.406 63.038 19.768 3.189
Test Planning 3.101 4.961 19.768 0.251
Verification and Validation 28.196 45.106 19.768 2.282
Project Office 6.899 11.036 19.768 0.558
CM/QA 7.899 12.636 19.768 0.639
Manuals 7.000 11.198 19.768 0.566

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE MAINTENANCE INFORMATION


=============================================

MODULE Configuration/Requir
Development SLOC 78000
Development Nominal Person-Month 559.18
Development Adjusted Person-Month 559.18
Development Cost 0.00
Maintenance Labor Rate 0.00
Percentage Added 0.00%
Percentage Modified 0.00%
Maintenance Software Understanding 30.00
Unfamiliarity with the Software 0.40

Annual Change Traffic 0.00%

Effort Adjustment Factor 1.00

+ Product ++ Platform +
RELY DATA DOCU CPLX RUSE TIME STOR PVOL
Nominal Nominal Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0% 0% 0%
+ Personnel +
ACAP AEXP PCAP PEXP LTEX PCON
Nominal Nominal Nominal Nominal Nominal Nominal

0% 0% 0% 0% 0% 0%
+ Project + User +
TOOL SITE USR1 USR2
Nominal Nominal Nominal Nominal
0% 0% 0% 0%

WORK LOAD PROJECTION FOR THE NEXT 1 YEARS


Year KSLOC Effort_Nom Effort_Maint Staff KSLOC/Staff Cost
1 78.00 0.00 0.00 0.00 0.00 0.00

SUMMARY OF PROJECTION
Cumulative Maintenance Person-month 0.00
Overall Development and Maintenance Person-month 559.18
Cumulative Maintenance Cost 0.00
Overall Development and Maintenance Cost 0.00

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

+ Product ++ Platform +
MODULE NAME           RELY  DATA  DOCU  CPLX  RUSE  TIME  STOR  PVOL
Communications        N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

+ Personnel ++ Project ++ User +
MODULE NAME           ACAP  AEXP  PCAP  PEXP  LEXP  PCON  TOOL  SCED  SITE  USR1  USR2
Communications        N     N     N     N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

MODULE PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution


=========================================================================
=====
MODULE Communications
SLOC 62000
TOTAL EFFORT 444.478 Person Months
=========================================================================
=====

EFFORT PCNT   EFFORT (PM)   SCHED PCNT   SCHEDULE (Months)   Staff
Plans And Requirements 7.000 31.113 22.406 14.959 2.080
Product Design 17.000 75.561 27.203 18.162 4.160
Programming 54.392 241.759 43.189 28.835 8.384
- Detailed Design 23.797 105.773 ---- ---- ----
- Code and Unit Test 30.594 135.985 ---- ---- ----
Integration and Test 28.608 127.158 29.608 19.768 6.433

=========================================================================
=====
Life Cycle Phase Plans And Requirements
Life Cycle Effort 31.113 Person Months
Life Cycle Schedule 14.959 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 13.938 14.959 0.932
Product Design 17.601 5.476 14.959 0.366
Programming 5.703 1.774 14.959 0.119
Test Planning 4.101 1.276 14.959 0.085
Verification and Validation 7.601 2.365 14.959 0.158
Project Office 12.297 3.826 14.959 0.256
CM/QA 2.899 0.902 14.959 0.060
Manuals 5.000 1.556 14.959 0.104

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE PHASE & ACTIVITY INFORMATION


========================================

=========================================================================
=====
Life Cycle Phase Product Design
Life Cycle Effort 75.561 Person Months
Life Cycle Schedule 18.162 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 12.500 9.445 18.162 0.520
Product Design 41.000 30.980 18.162 1.706
Programming 13.601 10.277 18.162 0.566

Test Planning 6.101 4.610 18.162 0.254
Verification and Validation 7.601 5.744 18.162 0.316
Project Office 9.797 7.403 18.162 0.408
CM/QA 2.399 1.812 18.162 0.100
Manuals 7.000 5.289 18.162 0.291

=========================================================================
=====
Life Cycle Phase Programming
Life Cycle Effort 241.759 Person Months
Life Cycle Schedule 28.835 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 4.000 9.670 28.835 0.335
Product Design 8.000 19.341 28.835 0.671
Programming 56.500 136.594 28.835 4.737
Test Planning 5.601 13.542 28.835 0.470
Verification and Validation 8.601 20.795 28.835 0.721
Project Office 5.899 14.260 28.835 0.495
CM/QA 6.399 15.469 28.835 0.536
Manuals 5.000 12.088 28.835 0.419

=========================================================================
=====
Life Cycle Phase Integration and Test
Life Cycle Effort 127.158 Person Months
Life Cycle Schedule 19.768 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 2.500 3.179 19.768 0.161
Product Design 5.000 6.358 19.768 0.322
Programming 39.406 50.107 19.768 2.535
Test Planning 3.101 3.944 19.768 0.200
Verification and Validation 28.196 35.853 19.768 1.814
Project Office 6.899 8.772 19.768 0.444
CM/QA 7.899 10.044 19.768 0.508
Manuals 7.000 8.901 19.768 0.450

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE MAINTENANCE INFORMATION
=============================================

MODULE Communications
Development SLOC 62000
Development Nominal Person-Month 444.48
Development Adjusted Person-Month 444.48
Development Cost 0.00
Maintenance Labor Rate 0.00
Percentage Added 0.00%
Percentage Modified 0.00%
Maintenance Software Understanding 30.00
Unfamiliarity with the Software 0.40

Annual Change Traffic 0.00%

Effort Adjustment Factor 1.00

+ Product ++ Platform +
RELY DATA DOCU CPLX RUSE TIME STOR PVOL
Nominal Nominal Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0% 0% 0%
+ Personnel +
ACAP AEXP PCAP PEXP LTEX PCON
Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0%
+ Project + User +
TOOL SITE USR1 USR2
Nominal Nominal Nominal Nominal
0% 0% 0% 0%

WORK LOAD PROJECTION FOR THE NEXT 1 YEARS


Year KSLOC Effort_Nom Effort_Maint Staff KSLOC/Staff Cost
1 62.00 0.00 0.00 0.00 0.00 0.00

SUMMARY OF PROJECTION
Cumulative Maintenance Person-month 0.00
Overall Development and Maintenance Person-month 444.48
Cumulative Maintenance Cost 0.00
Overall Development and Maintenance Cost 0.00

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

+ Product ++ Platform +
MODULE NAME           RELY  DATA  DOCU  CPLX  RUSE  TIME  STOR  PVOL
Graphics Presentatio  N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

+ Personnel ++ Project ++ User +
MODULE NAME           ACAP  AEXP  PCAP  PEXP  LEXP  PCON  TOOL  SCED  SITE  USR1  USR2
Graphics Presentatio  N     N     N     N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

MODULE PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution


=========================================================================
=====
MODULE Graphics Presentatio
SLOC 110000
TOTAL EFFORT 788.590 Person Months
=========================================================================
=====
EFFORT PCNT   EFFORT (PM)   SCHED PCNT   SCHEDULE (Months)   Staff
Plans And Requirements 7.000 55.201 22.406 14.959 3.690
Product Design 17.000 134.060 27.203 18.162 7.382
Programming 54.392 428.927 43.189 28.835 14.875
- Detailed Design 23.797 187.662 ---- ---- ----
- Code and Unit Test 30.594 241.265 ---- ---- ----
Integration and Test 28.608 225.603 29.608 19.768 11.413

=========================================================================
=====
Life Cycle Phase Plans And Requirements
Life Cycle Effort 55.201 Person Months
Life Cycle Schedule 14.959 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 24.729 14.959 1.653
Product Design 17.601 9.716 14.959 0.650
Programming 5.703 3.148 14.959 0.210

Test Planning 4.101 2.264 14.959 0.151
Verification and Validation 7.601 4.196 14.959 0.281
Project Office 12.297 6.788 14.959 0.454
CM/QA 2.899 1.600 14.959 0.107
Manuals 5.000 2.760 14.959 0.185

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE PHASE & ACTIVITY INFORMATION


========================================

=========================================================================
=====
Life Cycle Phase Product Design
Life Cycle Effort 134.060 Person Months
Life Cycle Schedule 18.162 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 12.500 16.758 18.162 0.923
Product Design 41.000 54.965 18.162 3.026
Programming 13.601 18.234 18.162 1.004
Test Planning 6.101 8.180 18.162 0.450
Verification and Validation 7.601 10.190 18.162 0.561
Project Office 9.797 13.134 18.162 0.723
CM/QA 2.399 3.216 18.162 0.177
Manuals 7.000 9.384 18.162 0.517

=========================================================================
=====
Life Cycle Phase Programming
Life Cycle Effort 428.927 Person Months
Life Cycle Schedule 28.835 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 4.000 17.157 28.835 0.595
Product Design 8.000 34.314 28.835 1.190
Programming 56.500 242.344 28.835 8.405
Test Planning 5.601 24.026 28.835 0.833
Verification and Validation 8.601 36.894 28.835 1.279

Project Office 5.899 25.301 28.835 0.877
CM/QA 6.399 27.445 28.835 0.952
Manuals 5.000 21.446 28.835 0.744

=========================================================================
=====
Life Cycle Phase Integration and Test
Life Cycle Effort 225.603 Person Months
Life Cycle Schedule 19.768 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 2.500 5.640 19.768 0.285
Product Design 5.000 11.280 19.768 0.571
Programming 39.406 88.900 19.768 4.497
Test Planning 3.101 6.997 19.768 0.354
Verification and Validation 28.196 63.611 19.768 3.218
Project Office 6.899 15.563 19.768 0.787
CM/QA 7.899 17.819 19.768 0.901
Manuals 7.000 15.792 19.768 0.799

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE MAINTENANCE INFORMATION


=============================================

MODULE Graphics Presentatio


Development SLOC 110000
Development Nominal Person-Month 788.59
Development Adjusted Person-Month 788.59
Development Cost 0.00
Maintenance Labor Rate 0.00
Percentage Added 0.00%
Percentage Modified 0.00%
Maintenance Software Understanding 30.00
Unfamiliarity with the Software 0.40

Annual Change Traffic 0.00%

Effort Adjustment Factor 1.00

+ Product ++ Platform +
RELY DATA DOCU CPLX RUSE TIME STOR PVOL
Nominal Nominal Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0% 0% 0%
+ Personnel +
ACAP AEXP PCAP PEXP LTEX PCON
Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0%
+ Project + User +
TOOL SITE USR1 USR2
Nominal Nominal Nominal Nominal
0% 0% 0% 0%

WORK LOAD PROJECTION FOR THE NEXT 1 YEARS


Year KSLOC Effort_Nom Effort_Maint Staff KSLOC/Staff Cost
1 110.00 0.00 0.00 0.00 0.00 0.00

SUMMARY OF PROJECTION
Cumulative Maintenance Person-month 0.00
Overall Development and Maintenance Person-month 788.59
Cumulative Maintenance Cost 0.00
Overall Development and Maintenance Cost 0.00

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

+ Product ++ Platform +
MODULE NAME           RELY  DATA  DOCU  CPLX  RUSE  TIME  STOR  PVOL
Word Processing       N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

+ Personnel ++ Project ++ User +
MODULE NAME           ACAP  AEXP  PCAP  PEXP  LEXP  PCON  TOOL  SCED  SITE  USR1  USR2
Word Processing       N     N     N     N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

MODULE PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution
=========================================================================
=====
MODULE Word Processing
SLOC 65000
TOTAL EFFORT 465.985 Person Months
=========================================================================
=====
EFFORT PCNT   EFFORT (PM)   SCHED PCNT   SCHEDULE (Months)   Staff
Plans And Requirements 7.000 32.619 22.406 14.959 2.181
Product Design 17.000 79.217 27.203 18.162 4.362
Programming 54.392 253.457 43.189 28.835 8.790
- Detailed Design 23.797 110.891 ---- ---- ----
- Code and Unit Test 30.594 142.565 ---- ---- ----
Integration and Test 28.608 133.311 29.608 19.768 6.744

=========================================================================
=====
Life Cycle Phase Plans And Requirements
Life Cycle Effort 32.619 Person Months
Life Cycle Schedule 14.959 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 14.612 14.959 0.977
Product Design 17.601 5.741 14.959 0.384
Programming 5.703 1.860 14.959 0.124
Test Planning 4.101 1.338 14.959 0.089
Verification and Validation 7.601 2.479 14.959 0.166
Project Office 12.297 4.011 14.959 0.268
CM/QA 2.899 0.945 14.959 0.063
Manuals 5.000 1.631 14.959 0.109

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE PHASE & ACTIVITY INFORMATION


========================================

=========================================================================
=====
Life Cycle Phase Product Design

Life Cycle Effort 79.217 Person Months
Life Cycle Schedule 18.162 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 12.500 9.902 18.162 0.545
Product Design 41.000 32.479 18.162 1.788
Programming 13.601 10.775 18.162 0.593
Test Planning 6.101 4.833 18.162 0.266
Verification and Validation 7.601 6.022 18.162 0.332
Project Office 9.797 7.761 18.162 0.427
CM/QA 2.399 1.900 18.162 0.105
Manuals 7.000 5.545 18.162 0.305

=========================================================================
=====
Life Cycle Phase Programming
Life Cycle Effort 253.457 Person Months
Life Cycle Schedule 28.835 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 4.000 10.138 28.835 0.352
Product Design 8.000 20.277 28.835 0.703
Programming 56.500 143.203 28.835 4.966
Test Planning 5.601 14.197 28.835 0.492
Verification and Validation 8.601 21.801 28.835 0.756
Project Office 5.899 14.950 28.835 0.518
CM/QA 6.399 16.218 28.835 0.562
Manuals 5.000 12.673 28.835 0.440

=========================================================================
=====
Life Cycle Phase Integration and Test
Life Cycle Effort 133.311 Person Months
Life Cycle Schedule 19.768 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 2.500 3.333 19.768 0.169
Product Design 5.000 6.666 19.768 0.337
Programming 39.406 52.532 19.768 2.657
Test Planning 3.101 4.134 19.768 0.209
Verification and Validation 28.196 37.588 19.768 1.901
Project Office 6.899 9.197 19.768 0.465
CM/QA 7.899 10.530 19.768 0.533

Manuals 7.000 9.332 19.768 0.472

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE MAINTENANCE INFORMATION


=============================================

MODULE Word Processing


Development SLOC 65000
Development Nominal Person-Month 465.99
Development Adjusted Person-Month 465.99
Development Cost 0.00
Maintenance Labor Rate 0.00
Percentage Added 0.00%
Percentage Modified 0.00%
Maintenance Software Understanding 30.00
Unfamiliarity with the Software 0.40

Annual Change Traffic 0.00%

Effort Adjustment Factor 1.00

+ Product ++ Platform +
RELY DATA DOCU CPLX RUSE TIME STOR PVOL
Nominal Nominal Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0% 0% 0%
+ Personnel +
ACAP AEXP PCAP PEXP LTEX PCON
Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0%
+ Project + User +
TOOL SITE USR1 USR2
Nominal Nominal Nominal Nominal
0% 0% 0% 0%

WORK LOAD PROJECTION FOR THE NEXT 1 YEARS


Year KSLOC Effort_Nom Effort_Maint Staff KSLOC/Staff Cost
1 65.00 0.00 0.00 0.00 0.00 0.00

SUMMARY OF PROJECTION

Cumulative Maintenance Person-month 0.00
Overall Development and Maintenance Person-month 465.99
Cumulative Maintenance Cost 0.00
Overall Development and Maintenance Cost 0.00

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

+ Product ++ Platform +
MODULE NAME           RELY  DATA  DOCU  CPLX  RUSE  TIME  STOR  PVOL
Project Management    N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

+ Personnel ++ Project ++ User +
MODULE NAME           ACAP  AEXP  PCAP  PEXP  LEXP  PCON  TOOL  SCED  SITE  USR1  USR2
Project Management    N     N     N     N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

MODULE PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution


=========================================================================
=====
MODULE Project Management
SLOC 56000
TOTAL EFFORT 401.464 Person Months
=========================================================================
=====
EFFORT PCNT   EFFORT (PM)   SCHED PCNT   SCHEDULE (Months)   Staff
Plans And Requirements 7.000 28.102 22.406 14.959 1.879
Product Design 17.000 68.249 27.203 18.162 3.758
Programming 54.392 218.363 43.189 28.835 7.573
- Detailed Design 23.797 95.537 ---- ---- ----
- Code and Unit Test 30.594 122.826 ---- ---- ----
Integration and Test 28.608 114.852 29.608 19.768 5.810

=========================================================================
=====

Life Cycle Phase Plans And Requirements
Life Cycle Effort 28.102 Person Months
Life Cycle Schedule 14.959 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 12.589 14.959 0.842
Product Design 17.601 4.946 14.959 0.331
Programming 5.703 1.603 14.959 0.107
Test Planning 4.101 1.153 14.959 0.077
Verification and Validation 7.601 2.136 14.959 0.143
Project Office 12.297 3.456 14.959 0.231
CM/QA 2.899 0.815 14.959 0.054
Manuals 5.000 1.405 14.959 0.094

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE PHASE & ACTIVITY INFORMATION


========================================

=========================================================================
=====
Life Cycle Phase Product Design
Life Cycle Effort 68.249 Person Months
Life Cycle Schedule 18.162 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 12.500 8.531 18.162 0.470
Product Design 41.000 27.982 18.162 1.541
Programming 13.601 9.283 18.162 0.511
Test Planning 6.101 4.164 18.162 0.229
Verification and Validation 7.601 5.188 18.162 0.286
Project Office 9.797 6.686 18.162 0.368
CM/QA 2.399 1.637 18.162 0.090
Manuals 7.000 4.777 18.162 0.263

=========================================================================
=====
Life Cycle Phase Programming
Life Cycle Effort 218.363 Person Months

Life Cycle Schedule 28.835 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 4.000 8.735 28.835 0.303
Product Design 8.000 17.469 28.835 0.606
Programming 56.500 123.375 28.835 4.279
Test Planning 5.601 12.231 28.835 0.424
Verification and Validation 8.601 18.782 28.835 0.651
Project Office 5.899 12.880 28.835 0.447
CM/QA 6.399 13.972 28.835 0.485
Manuals 5.000 10.918 28.835 0.379

=========================================================================
=====
Life Cycle Phase Integration and Test
Life Cycle Effort 114.852 Person Months
Life Cycle Schedule 19.768 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 2.500 2.871 19.768 0.145
Product Design 5.000 5.743 19.768 0.291
Programming 39.406 45.258 19.768 2.290
Test Planning 3.101 3.562 19.768 0.180
Verification and Validation 28.196 32.384 19.768 1.638
Project Office 6.899 7.923 19.768 0.401
CM/QA 7.899 9.072 19.768 0.459
Manuals 7.000 8.040 19.768 0.407

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE MAINTENANCE INFORMATION


=============================================

MODULE Project Management


Development SLOC 56000
Development Nominal Person-Month 401.46
Development Adjusted Person-Month 401.46
Development Cost 0.00
Maintenance Labor Rate 0.00

Percentage Added 0.00%
Percentage Modified 0.00%
Maintenance Software Understanding 30.00
Unfamiliarity with the Software 0.40

Annual Change Traffic 0.00%

Effort Adjustment Factor 1.00

+ Product ++ Platform +
RELY DATA DOCU CPLX RUSE TIME STOR PVOL
Nominal Nominal Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0% 0% 0%
+ Personnel +
ACAP AEXP PCAP PEXP LTEX PCON
Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0%
+ Project + User +
TOOL SITE USR1 USR2
Nominal Nominal Nominal Nominal
0% 0% 0% 0%

WORK LOAD PROJECTION FOR THE NEXT 1 YEARS


Year KSLOC Effort_Nom Effort_Maint Staff KSLOC/Staff Cost
1 56.00 0.00 0.00 0.00 0.00 0.00

SUMMARY OF PROJECTION
Cumulative Maintenance Person-month 0.00
Overall Development and Maintenance Person-month 401.46
Cumulative Maintenance Cost 0.00
Overall Development and Maintenance Cost 0.00

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

+ Product ++ Platform +
MODULE NAME           RELY  DATA  DOCU  CPLX  RUSE  TIME  STOR  PVOL
GPS Navigation        N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

+ Personnel ++ Project ++ User +
MODULE NAME           ACAP  AEXP  PCAP  PEXP  LEXP  PCON  TOOL  SCED  SITE  USR1  USR2
GPS Navigation        N     N     N     N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

MODULE PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution


=========================================================================
=====
MODULE GPS Navigation
SLOC 74000
TOTAL EFFORT 530.506 Person Months
=========================================================================
=====
EFFORT PCNT   EFFORT (PM)   SCHED PCNT   SCHEDULE (Months)   Staff
Plans And Requirements 7.000 37.135 22.406 14.959 2.483
Product Design 17.000 90.186 27.203 18.162 4.966
Programming 54.392 288.551 43.189 28.835 10.007
- Detailed Design 23.797 126.246 ---- ---- ----
- Code and Unit Test 30.594 162.305 ---- ---- ----
Integration and Test 28.608 151.769 29.608 19.768 7.678

=========================================================================
=====
Life Cycle Phase Plans And Requirements
Life Cycle Effort 37.135 Person Months
Life Cycle Schedule 14.959 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 16.636 14.959 1.112
Product Design 17.601 6.536 14.959 0.437
Programming 5.703 2.118 14.959 0.142
Test Planning 4.101 1.523 14.959 0.102
Verification and Validation 7.601 2.823 14.959 0.189
Project Office 12.297 4.567 14.959 0.305
CM/QA 2.899 1.076 14.959 0.072
Manuals 5.000 1.857 14.959 0.124

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE PHASE & ACTIVITY INFORMATION


========================================

=========================================================================
=====
Life Cycle Phase Product Design
Life Cycle Effort 90.186 Person Months
Life Cycle Schedule 18.162 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 12.500 11.273 18.162 0.621
Product Design 41.000 36.976 18.162 2.036
Programming 13.601 12.267 18.162 0.675
Test Planning 6.101 5.503 18.162 0.303
Verification and Validation 7.601 6.855 18.162 0.377
Project Office 9.797 8.836 18.162 0.487
CM/QA 2.399 2.163 18.162 0.119
Manuals 7.000 6.313 18.162 0.348

=========================================================================
=====
Life Cycle Phase Programming
Life Cycle Effort 288.551 Person Months
Life Cycle Schedule 28.835 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 4.000 11.542 28.835 0.400
Product Design 8.000 23.084 28.835 0.801
Programming 56.500 163.031 28.835 5.654
Test Planning 5.601 16.163 28.835 0.561
Verification and Validation 8.601 24.819 28.835 0.861
Project Office 5.899 17.020 28.835 0.590
CM/QA 6.399 18.463 28.835 0.640
Manuals 5.000 14.428 28.835 0.500

=========================================================================
=====
Life Cycle Phase Integration and Test

Life Cycle Effort 151.769 Person Months
Life Cycle Schedule 19.768 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 2.500 3.794 19.768 0.192
Product Design 5.000 7.588 19.768 0.384
Programming 39.406 59.806 19.768 3.025
Test Planning 3.101 4.707 19.768 0.238
Verification and Validation 28.196 42.793 19.768 2.165
Project Office 6.899 10.470 19.768 0.530
CM/QA 7.899 11.988 19.768 0.606
Manuals 7.000 10.624 19.768 0.537

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE MAINTENANCE INFORMATION


=============================================

MODULE GPS Navigation


Development SLOC 74000
Development Nominal Person-Month 530.51
Development Adjusted Person-Month 530.51
Development Cost 0.00
Maintenance Labor Rate 0.00
Percentage Added 0.00%
Percentage Modified 0.00%
Maintenance Software Understanding 30.00
Unfamiliarity with the Software 0.40

Annual Change Traffic 0.00%

Effort Adjustment Factor 1.00

+ Product ++ Platform +
RELY DATA DOCU CPLX RUSE TIME STOR PVOL
Nominal Nominal Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0% 0% 0%
+ Personnel +
ACAP AEXP PCAP PEXP LTEX PCON

Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0%
+ Project + User +
TOOL SITE USR1 USR2
Nominal Nominal Nominal Nominal
0% 0% 0% 0%

WORK LOAD PROJECTION FOR THE NEXT 1 YEARS


Year KSLOC Effort_Nom Effort_Maint Staff KSLOC/Staff Cost
1 74.00 0.00 0.00 0.00 0.00 0.00

SUMMARY OF PROJECTION
Cumulative Maintenance Person-month 0.00
Overall Development and Maintenance Person-month 530.51
Cumulative Maintenance Cost 0.00
Overall Development and Maintenance Cost 0.00

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

+ Product ++ Platform +
MODULE NAME           RELY  DATA  DOCU  CPLX  RUSE  TIME  STOR  PVOL
Compile/Link/Runtime  N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

+ Personnel ++ Project ++ User +
MODULE NAME           ACAP  AEXP  PCAP  PEXP  LEXP  PCON  TOOL  SCED  SITE  USR1  USR2
Compile/Link/Runtime  N     N     N     N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

MODULE PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution


=========================================================================
=====
MODULE Compile/Link/Runtime
SLOC 56000
TOTAL EFFORT 401.464 Person Months

=========================================================================
=====
EFFORT PCNT   EFFORT (PM)   SCHED PCNT   SCHEDULE (Months)   Staff
Plans And Requirements 7.000 28.102 22.406 14.959 1.879
Product Design 17.000 68.249 27.203 18.162 3.758
Programming 54.392 218.363 43.189 28.835 7.573
- Detailed Design 23.797 95.537 ---- ---- ----
- Code and Unit Test 30.594 122.826 ---- ---- ----
Integration and Test 28.608 114.852 29.608 19.768 5.810

=========================================================================
=====
Life Cycle Phase Plans And Requirements
Life Cycle Effort 28.102 Person Months
Life Cycle Schedule 14.959 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 12.589 14.959 0.842
Product Design 17.601 4.946 14.959 0.331
Programming 5.703 1.603 14.959 0.107
Test Planning 4.101 1.153 14.959 0.077
Verification and Validation 7.601 2.136 14.959 0.143
Project Office 12.297 3.456 14.959 0.231
CM/QA 2.899 0.815 14.959 0.054
Manuals 5.000 1.405 14.959 0.094

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE PHASE & ACTIVITY INFORMATION


========================================

=========================================================================
=====
Life Cycle Phase Product Design
Life Cycle Effort 68.249 Person Months
Life Cycle Schedule 18.162 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff

Requirements Analysis 12.500 8.531 18.162 0.470
Product Design 41.000 27.982 18.162 1.541
Programming 13.601 9.283 18.162 0.511
Test Planning 6.101 4.164 18.162 0.229
Verification and Validation 7.601 5.188 18.162 0.286
Project Office 9.797 6.686 18.162 0.368
CM/QA 2.399 1.637 18.162 0.090
Manuals 7.000 4.777 18.162 0.263

=========================================================================
=====
Life Cycle Phase Programming
Life Cycle Effort 218.363 Person Months
Life Cycle Schedule 28.835 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 4.000 8.735 28.835 0.303
Product Design 8.000 17.469 28.835 0.606
Programming 56.500 123.375 28.835 4.279
Test Planning 5.601 12.231 28.835 0.424
Verification and Validation 8.601 18.782 28.835 0.651
Project Office 5.899 12.880 28.835 0.447
CM/QA 6.399 13.972 28.835 0.485
Manuals 5.000 10.918 28.835 0.379

=========================================================================
=====
Life Cycle Phase Integration and Test
Life Cycle Effort 114.852 Person Months
Life Cycle Schedule 19.768 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 2.500 2.871 19.768 0.145
Product Design 5.000 5.743 19.768 0.291
Programming 39.406 45.258 19.768 2.290
Test Planning 3.101 3.562 19.768 0.180
Verification and Validation 28.196 32.384 19.768 1.638
Project Office 6.899 7.923 19.768 0.401
CM/QA 7.899 9.072 19.768 0.459
Manuals 7.000 8.040 19.768 0.407

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE MAINTENANCE INFORMATION


=============================================

MODULE Compile/Link/Runtime
Development SLOC 56000
Development Nominal Person-Month 401.46
Development Adjusted Person-Month 401.46
Development Cost 0.00
Maintenance Labor Rate 0.00
Percentage Added 0.00%
Percentage Modified 0.00%
Maintenance Software Understanding 30.00
Unfamiliarity with the Software 0.40

Annual Change Traffic 0.00%

Effort Adjustment Factor 1.00

+ Product ++ Platform +
RELY DATA DOCU CPLX RUSE TIME STOR PVOL
Nominal Nominal Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0% 0% 0%
+ Personnel +
ACAP AEXP PCAP PEXP LTEX PCON
Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0%
+ Project + User +
TOOL SITE USR1 USR2
Nominal Nominal Nominal Nominal
0% 0% 0% 0%

WORK LOAD PROJECTION FOR THE NEXT 1 YEARS


Year KSLOC Effort_Nom Effort_Maint Staff KSLOC/Staff Cost
1 56.00 0.00 0.00 0.00 0.00 0.00

SUMMARY OF PROJECTION
Cumulative Maintenance Person-month 0.00
Overall Development and Maintenance Person-month 401.46

Cumulative Maintenance Cost 0.00
Overall Development and Maintenance Cost 0.00

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

+ Product ++ Platform +
MODULE NAME           RELY  DATA  DOCU  CPLX  RUSE  TIME  STOR  PVOL
Debugging/Testing     N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

+ Personnel ++ Project ++ User +
MODULE NAME           ACAP  AEXP  PCAP  PEXP  LEXP  PCON  TOOL  SCED  SITE  USR1  USR2
Debugging/Testing     N     N     N     N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

MODULE PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution


=========================================================================
=====
MODULE Debugging/Testing
SLOC 65000
TOTAL EFFORT 465.985 Person Months
=========================================================================
=====
EFFORT PCNT   EFFORT (PM)   SCHED PCNT   SCHEDULE (Months)   Staff
Plans And Requirements 7.000 32.619 22.406 14.959 2.181
Product Design 17.000 79.217 27.203 18.162 4.362
Programming 54.392 253.457 43.189 28.835 8.790
- Detailed Design 23.797 110.891 ---- ---- ----
- Code and Unit Test 30.594 142.565 ---- ---- ----
Integration and Test 28.608 133.311 29.608 19.768 6.744

=========================================================================
=====
Life Cycle Phase Plans And Requirements
Life Cycle Effort 32.619 Person Months

Life Cycle Schedule 14.959 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 14.612 14.959 0.977
Product Design 17.601 5.741 14.959 0.384
Programming 5.703 1.860 14.959 0.124
Test Planning 4.101 1.338 14.959 0.089
Verification and Validation 7.601 2.479 14.959 0.166
Project Office 12.297 4.011 14.959 0.268
CM/QA 2.899 0.945 14.959 0.063
Manuals 5.000 1.631 14.959 0.109

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE PHASE & ACTIVITY INFORMATION


========================================

=========================================================================
=====
Life Cycle Phase Product Design
Life Cycle Effort 79.217 Person Months
Life Cycle Schedule 18.162 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 12.500 9.902 18.162 0.545
Product Design 41.000 32.479 18.162 1.788
Programming 13.601 10.775 18.162 0.593
Test Planning 6.101 4.833 18.162 0.266
Verification and Validation 7.601 6.022 18.162 0.332
Project Office 9.797 7.761 18.162 0.427
CM/QA 2.399 1.900 18.162 0.105
Manuals 7.000 5.545 18.162 0.305

=========================================================================
=====
Life Cycle Phase Programming
Life Cycle Effort 253.457 Person Months
Life Cycle Schedule 28.835 Months

=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 4.000 10.138 28.835 0.352
Product Design 8.000 20.277 28.835 0.703
Programming 56.500 143.203 28.835 4.966
Test Planning 5.601 14.197 28.835 0.492
Verification and Validation 8.601 21.801 28.835 0.756
Project Office 5.899 14.950 28.835 0.518
CM/QA 6.399 16.218 28.835 0.562
Manuals 5.000 12.673 28.835 0.440

=========================================================================
=====
Life Cycle Phase Integration and Test
Life Cycle Effort 133.311 Person Months
Life Cycle Schedule 19.768 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 2.500 3.333 19.768 0.169
Product Design 5.000 6.666 19.768 0.337
Programming 39.406 52.532 19.768 2.657
Test Planning 3.101 4.134 19.768 0.209
Verification and Validation 28.196 37.588 19.768 1.901
Project Office 6.899 9.197 19.768 0.465
CM/QA 7.899 10.530 19.768 0.533
Manuals 7.000 9.332 19.768 0.472

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE MAINTENANCE INFORMATION


=============================================

MODULE Debugging/Testing
Development SLOC 65000
Development Nominal Person-Month 465.99
Development Adjusted Person-Month 465.99
Development Cost 0.00
Maintenance Labor Rate 0.00
Percentage Added 0.00%

Percentage Modified 0.00%
Maintenance Software Understanding 30.00
Unfamiliarity with the Software 0.40

Annual Change Traffic 0.00%

Effort Adjustment Factor 1.00

+ Product ++ Platform +
RELY DATA DOCU CPLX RUSE TIME STOR PVOL
Nominal Nominal Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0% 0% 0%
+ Personnel +
ACAP AEXP PCAP PEXP LTEX PCON
Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0%
+ Project + User +
TOOL SITE USR1 USR2
Nominal Nominal Nominal Nominal
0% 0% 0% 0%

WORK LOAD PROJECTION FOR THE NEXT 1 YEARS


Year KSLOC Effort_Nom Effort_Maint Staff KSLOC/Staff Cost
1 65.00 0.00 0.00 0.00 0.00 0.00

SUMMARY OF PROJECTION
Cumulative Maintenance Person-month 0.00
Overall Development and Maintenance Person-month 465.99
Cumulative Maintenance Cost 0.00
Overall Development and Maintenance Cost 0.00

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

+ Product ++ Platform +
MODULE NAME           RELY  DATA  DOCU  CPLX  RUSE  TIME  STOR  PVOL
Electronic Inventory  N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

+ Personnel ++ Project ++ User +
MODULE NAME           ACAP  AEXP  PCAP  PEXP  LEXP  PCON  TOOL  SCED  SITE  USR1  USR2
Electronic Inventory  N     N     N     N     N     N     N     N     N     N     N
                      0%    0%    0%    0%    0%    0%    0%    0%    0%    0%    0%
                      1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00  1.00

MODULE PHASE & ACTIVITY INFORMATION


========================================

Overall Phase Distribution


=========================================================================
=====
MODULE Electronic Inventory
SLOC 365000
TOTAL EFFORT 2616.686 Person Months
=========================================================================
=====
EFFORT PCNT   EFFORT (PM)   SCHED PCNT   SCHEDULE (Months)   Staff
Plans And Requirements 7.000 183.168 22.406 14.959 12.245
Product Design 17.000 444.837 27.203 18.162 24.493
Programming 54.392 1423.258 43.189 28.835 49.359
- Detailed Design 23.797 622.698 ---- ---- ----
- Code and Unit Test 30.594 800.560 ---- ---- ----
Integration and Test 28.608 748.591 29.608 19.768 37.869

=========================================================================
=====
Life Cycle Phase Plans And Requirements
Life Cycle Effort 183.168 Person Months
Life Cycle Schedule 14.959 Months
=========================================================================
=====
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 44.797 82.054 14.959 5.485
Product Design 17.601 32.240 14.959 2.155
Programming 5.703 10.446 14.959 0.698
Test Planning 4.101 7.512 14.959 0.502
Verification and Validation 7.601 13.923 14.959 0.931
Project Office 12.297 22.525 14.959 1.506
CM/QA 2.899 5.309 14.959 0.355
Manuals 5.000 9.158 14.959 0.612

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE PHASE & ACTIVITY INFORMATION


========================================

==============================================================================
Life Cycle Phase Product Design
Life Cycle Effort 444.837 Person Months
Life Cycle Schedule 18.162 Months
==============================================================================
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 12.500 55.605 18.162 3.062
Product Design 41.000 182.383 18.162 10.042
Programming 13.601 60.504 18.162 3.331
Test Planning 6.101 27.141 18.162 1.494
Verification and Validation 7.601 33.814 18.162 1.862
Project Office 9.797 43.582 18.162 2.400
CM/QA 2.399 10.670 18.162 0.587
Manuals 7.000 31.139 18.162 1.715

==============================================================================
Life Cycle Phase Programming
Life Cycle Effort 1423.258 Person Months
Life Cycle Schedule 28.835 Months
==============================================================================
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 4.000 56.930 28.835 1.974
Product Design 8.000 113.861 28.835 3.949
Programming 56.500 804.141 28.835 27.888
Test Planning 5.601 79.722 28.835 2.765
Verification and Validation 8.601 122.420 28.835 4.246
Project Office 5.899 83.952 28.835 2.912
CM/QA 6.399 91.069 28.835 3.158
Manuals 5.000 71.163 28.835 2.468

==============================================================================
Life Cycle Phase Integration and Test
Life Cycle Effort 748.591 Person Months
Life Cycle Schedule 19.768 Months
==============================================================================
PCNT EFFORT (PM) SCHEDULE Staff
Requirements Analysis 2.500 18.715 19.768 0.947
Product Design 5.000 37.430 19.768 1.893
Programming 39.406 294.987 19.768 14.923
Test Planning 3.101 23.217 19.768 1.174
Verification and Validation 28.196 211.071 19.768 10.678
Project Office 6.899 51.642 19.768 2.612
CM/QA 7.899 59.128 19.768 2.991
Manuals 7.000 52.401 19.768 2.651

Printed at: Tue Nov 07 10:29:55 2006

Project Name: AWATS

MODULE MAINTENANCE INFORMATION


=============================================

MODULE Electronic Inventory


Development SLOC 365000
Development Nominal Person-Month 2616.69
Development Adjusted Person-Month 2616.69
Development Cost 0.00
Maintenance Labor Rate 0.00
Percentage Added 0.00%
Percentage Modified 0.00%
Maintenance Software Understanding 30.00
Unfamiliarity with the Software 0.40

Annual Change Traffic 0.00%

Effort Adjustment Factor 1.00

+ Product ++ Platform +
RELY DATA DOCU CPLX RUSE TIME STOR PVOL
Nominal Nominal Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0% 0% 0%
+ Personnel +
ACAP AEXP PCAP PEXP LTEX PCON
Nominal Nominal Nominal Nominal Nominal Nominal
0% 0% 0% 0% 0% 0%
+ Project + User +
TOOL SITE USR1 USR2
Nominal Nominal Nominal Nominal
0% 0% 0% 0%

WORK LOAD PROJECTION FOR THE NEXT 1 YEARS


Year KSLOC Effort_Nom Effort_Maint Staff KSLOC/Staff Cost
1 365.00 0.00 0.00 0.00 0.00 0.00

SUMMARY OF PROJECTION
Cumulative Maintenance Person-month 0.00
Overall Development and Maintenance Person-month 2616.69
Cumulative Maintenance Cost 0.00
Overall Development and Maintenance Cost 0.00

Index

A
Abstract, viii, xiv
Alpha Test, 32
Appendix A, 149
Appendix B, 150
Appendix C, 152
Appendix D, 153
Appendix E, 154
Appendix F, 155
Appendix G, 157
Appendix H, 158
Appendix I, 159
Appendix J, 161
armory, vi, 1, 6, 7, 8, 10, 11, 12, 104
AWATS, i, vi, vii, viii, x, xii, xiii, xiv, xvii, xix, 1, 2, 6, 7, 8, 10, 12, 13, 15, 18, 23, 24, 25, 28, 29, 31, 32, 33, 35, 36, 38, 39, 54, 55, 56, 57, 58, 59, 60, 61, 62, 67, 68, 69, 70, 77, 81, 100, 101, 102, 103, 104, 106, 118, 124, 128, 133, 149, 150, 152, 157, 161, 163, 164, 167, 168, 171, 173, 174, 175, 176, 177, 178, 179, 181, 182, 183, 185, 186, 187, 188, 189, 190, 192, 193, 194, 196, 197, 198, 199, 200, 202, 203, 204, 206, 207, 208, 209, 212, 213, 214, 215, 216, 219, 220
Automated Weapon Accountability and Tracking System, vi, viii, x

B
Beta Test, 32
Budget, 59

C
C#, 19
Ciarleglio
  Lina Ciarleglio. See
COCOMO, xvi, 20, 129, 130, 172, 173
Communications Package, vi, xx, 2, 75, 78, 114
Computer Architecture Diagram, iv, xix, 4
COTS, x, xix, 1, 7, 8, 10, 13, 15, 17, 28, 31, 60, 68, 96, 102, 107, 108, 109, 118, 124, 132, 133, 163
Custom Code, 13

D
Dam
  Tuyen Dam. See
Database Package, vi, xiii, xix, 1, 7, 12, 62, 74, 77, 107, 129, 130, 131
Debugging/Testing Package, vii, 2, 76, 80
Documentation Costs, ix, xv, xvii, 150, 151
Doolittle
  Doolittle. See

E
Electronic Inventory and Tracking Package, vi, 2, 16
End-User Profile, 8
ER Diagrams, 96
EZAntivirus, 98

F
Firewire connector, 9
Framework, iii, 9, 97, 98

G
General Services Administration, 6
GPS Navigation Package, vi, xiii, xix, 2, 7, 8, 12, 15, 62, 76, 79, 112, 129, 130, 131
Graphical Presentation Package, vii, 2, 10

H
Hardware Costs, ix, xv, xvii, 152
Help Desk, iii, xv, 41, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 65, 67, 72, 99, 101, 150, 156
Helpdesk Costs, ix, xv, xvii, 153
Henry
  Matt Henry. See
Hoodwick, i

I
IEEE 1028, 97
IEEE 1298, 97
IEEE 730, 97
Integration & Test, 31
inventory, vi, xii, 1, 2, 6, 11, 12, 55

J
James "Jimmy" Doolittle, x
JAVA, xiii, 20, 97
JTC 1/SC 17, 18
JTC 1/SC 31, 18

K
Ken Nidiffer, i
Kernel, 13, 20, 58, 123, 138

L
Leonard Woody III, i
Lina Ciarleglio, i
List of Figures, xiv, xix
List of Tables, xiv, xvii

M
M-14, 6
M-16A2, 6
M-4, 6
Master Schedule, 62
Matt Henry, i, xx, 142

N
Nidiffer
  Ken Nidiffer. See
NUnit, 97

O
Outsourcing, 13

P
Preface, vi, xiv
Process Model, 25
Product Summary, ii, xiv, 6
Progress Metrics, 68
Project Charter, x, xiv, 28
Project Cost, ix, 67
Project Management Package, vi, xix, 1, 109
Project Scope, ii, xii, xiv
Project Summary, ii, xiv, 12

Q
Quality Assurance, iv, xv, 13, 21, 42, 60, 67, 97, 102, 103, 118, 133
Quality Assurance (QA) Specialist, 42

R
Rational Rose, 96
Requirement Management Package, vi, xix, 1, 111
Retaining, 71
Reuse, xix, 1, 2, 13, 15, 17, 22, 64, 96, 97, 99, 110, 111, 112, 132
RFID, iii, x, xii, xiii, xiv, xv, xvii, 1, 2, 7, 9, 10, 18, 24, 32, 36, 41, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 56, 57, 59, 62, 73, 77, 81, 104, 120, 121, 122, 123, 124, 135, 136, 138, 152, 161, 162, 163, 167, 168, 169, 170
Risk, iv, xv, xvii, 10, 18, 22, 57, 58, 59, 99
Runtime Package, vi, xix, 1, 15, 76, 80, 108

S
Secure Sockets Layer (SSL), 8
Software Package Costs, ix, xv, xvii, 154
Software Tool Costs, ix, xv, xviii, 155, 156
Spreadsheet Package, vi, xiii, xix, 1, 7, 62, 74, 78, 110, 129, 130, 131
System Architecture Diagram, iii, xix, 3, 149
System Kernel Software, ix
System XYZ software, 57

T
Table of Contents, xiv
Total Cost, ix, xiv, 151, 153, 155, 156
Tuyen Dam, i

U
Unit Testing, 14, 23, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 97, 119, 120, 121, 122, 123, 134, 135, 136, 137, 138
United States Marine Corps
  USMC. See
Use Case Diagram, xix, 5, 149

V
Verification and Validation, iv, xv, 97, 102, 103, 128, 159, 160, 176, 177, 179, 180, 181, 183, 184, 187, 188, 190, 191, 192, 194, 195, 198, 199, 201, 202, 203, 205, 206, 207, 209, 210, 211, 214, 215, 217, 219, 220

W
Waterfall Model, 23
Waterfall software development, 25
Woody
  Leonard Woody III. See
Word Processing Package, vi, xiii, xx, 2, 7, 8, 12, 15, 62, 75, 79, 115, 129, 130, 131
Work Breakdown Structure, v, xvi, xix, xx, 13, 23, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 161

X
XYZ software, vi, x
  System XYZ software. See

Y
Year, 72

Z
Zachman Enterprise Architecture Framework, xv, xvii, 149
ZIP drive, xii
