
Software Process and Project Management

The Initial Process:


 The Initial Process could properly be called ad hoc, and it is often even chaotic.
 Here, the organization typically operates without formalized procedures, cost estimates,
and project plans.
 Tools are neither well integrated with the process nor uniformly applied.
 Change control is lax and there is little senior management exposure to or understanding
of the problems and issues.
 Since problems are often deferred or even forgotten, software installation and maintenance
often present serious problems.
 While organizations at this level may have formal procedures for project control, there is
no management mechanism to ensure they are used.
 The best test is to observe how such an organization behaves in a crisis.
 If it abandons established procedures and reverts to merely coding and testing, it is likely
to be at the Initial Process level.
 After all, if the techniques and methods are appropriate, they must be used in a crisis and
if they are not appropriate, they should not be used at all.
 One reason organizations behave chaotically is that they have not gained sufficient
experience to understand the consequences of such behavior. Because many effective
software actions such as design and code reviews or test data analysis do not appear to
directly support shipping the product, they seem expendable.
 It is much like driving an automobile. Few drivers with any experience will
continue driving for very long when the engine warning light comes on, regardless of
their rush. Similarly, most drivers starting on a new journey will, regardless of their hurry,
pause to consult a map.
 They have learned the difference between speed and progress.
 In software, coding and testing seem like progress, but they are often only wheel-
spinning.
 While they must be done, there is always the danger of going in the wrong
direction. Without a sound plan and a thoughtful analysis of the problems, there is no way
to know. Organizations at the Initial Process level can improve their performance by
instituting basic project controls.
 The most important are: Project management. The fundamental role of a project-management system is to ensure effective control of commitments. This requires adequate preparation, clear responsibility, a public declaration, and a dedication to performance.
 For software, this starts with an understanding of the job's magnitude. In any but the simplest projects, a plan must then be developed to determine the best schedule and the resources required. In the absence of such an orderly plan, no commitment can be better than an educated guess.
 Management oversight. A disciplined software-development organization must have senior management oversight. This includes review and approval of all major development plans before official commitment. Also, a quarterly review should be conducted of facility-wide process compliance, installed-quality performance, schedule tracking, cost trends, computing service, and quality and productivity goals by project.
 The lack of such reviews typically results in uneven and generally inadequate implementation of the process, as well as in frequent overcommitments and cost surprises.
 Quality assurance. A quality assurance group is charged with assuring management that the software-development work is actually done the way it is supposed to be done.
 To be effective, the assurance organization must have an independent reporting line to senior management and sufficient resources to monitor performance of all key planning, implementation, and verification activities.
 This generally requires an organization of about 5 to 6 percent the size of the development organization.
 Change control. Control of changes in software development is fundamental to business and financial control as well as to technical stability.
 To develop quality software on a predictable schedule, the requirements must be
established and maintained with reasonable stability throughout the development cycle.
Changes will have to be made, but they must be managed and introduced in an orderly
way.
 While occasional requirements changes are needed, historical evidence demonstrates that many of them can be deferred and phased in later. If all changes are not controlled, orderly design, implementation, and testing are impossible and no quality plan can be effective.
The Repeatable Process:
 The Repeatable Process has one important strength over the Initial Process: It provides
commitment control.
 This is such an enormous advance over the Initial Process that the people in the
organization tend to believe they have mastered the software problem.
 They do not realize that their strength stems from their prior experience at similar
work. Organizations at the Repeatable Process level thus face major risks when they are
presented with new challenges. Examples of the changes that represent the highest risk at
this level are: New tools and methods will likely affect how the process is performed, thus
destroying the relevance of the intuitive historical base on which the organization relies.
 Without a defined process framework in which to address these risks, it is even possible for a new technology to do more harm than good. When the organization must develop a new kind of product, it is entering new territory.
 For example, a software group that has experience developing compilers will likely have
design, scheduling, and estimating problems if assigned to write a control program.
Similarly, a group that has developed small, self-contained programs will not understand
the interface and integration issues involved in large-scale projects.
 These changes again destroy the relevance of the intuitive historical basis for the
organization's work. Major organization changes can be highly disruptive.
 In the Repeatable Process organization, a new manager has no orderly basis for
understanding what is going on and new team members must learn the ropes through word
of mouth.
 The key actions required to advance from the Repeatable Process to the Defined Process
are:
1. Establish a process group. A process group is a technical group that focuses exclusively on improving the software development process. In most software organizations, people are entirely devoted to product work. Until someone is given a full-time assignment to work on the process, little orderly progress can be made in improving it. The responsibilities of process groups include defining the development process, identifying technology needs and opportunities, advising the projects, and conducting quarterly management reviews of process status and performance. Typically, the process group should be about 1 to 3 percent the size of the development organization. Because of the need for a nucleus of skills, groups smaller than about four are unlikely to be fully effective. Small organizations that lack the experience base to form a process group should address these issues through specially formed committees of experienced professionals or by retaining consultants.
2. Establish a software-development process architecture that describes the technical and management activities required for proper execution of the development process. The architecture is a structural decomposition of the development cycle into tasks, each of which has entry criteria, functional descriptions, verification procedures, and exit criteria (see the sketch after this list). The decomposition continues until each defined task is performed by an individual or single management unit.
3. If they are not already in place, introduce a family of software-engineering methods and
technologies. These include design and code inspections, formal design methods, library-
control systems, and comprehensive testing methods. Prototyping should also be
considered, along with the adoption of modern implementation languages.
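The task decomposition described in step 2 (entry criteria, functional description, verification, exit criteria) can be captured in a small data structure. The sketch below is a minimal illustration in Python; the task name, criteria, and owner are hypothetical examples, not part of any prescribed architecture.

```python
from dataclasses import dataclass

@dataclass
class ProcessTask:
    """One task in a development process architecture (step 2 above)."""
    name: str
    entry_criteria: list[str]       # what must be true before the task starts
    functional_description: str     # what the task actually does
    verification: list[str]         # how proper completion is checked
    exit_criteria: list[str]        # what must be true when the task ends
    owner: str                      # individual or single management unit

# Hypothetical example: a code-inspection task within the architecture.
code_inspection = ProcessTask(
    name="Code inspection",
    entry_criteria=["Module compiles cleanly", "Design review completed"],
    functional_description="Moderated inspection of the module against its design",
    verification=["Inspection report filed", "All major defects resolved"],
    exit_criteria=["Exit decision recorded by the moderator"],
    owner="Inspection moderator",
)
print(code_inspection.name, "->", code_inspection.exit_criteria)
```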
The Defined Process:
 With the Defined Process, the organization has achieved the foundation for major and
continuing progress.
 For example, the development group, when faced with a crisis, will likely
continue to use the Defined Process.
 The foundation has now been established for examining the process and deciding how to
improve it.
 As powerful as the Defined Process is, it is still only qualitative: There is little data to
indicate what is going on or how effective the process really is. There is considerable
debate about the value of software-process measurements and the best ones to use.
 This uncertainty generally stems from a lack of process definition and the
consequent confusion about the specific items to be measured. With a defined process, we
can focus the measurements on specific tasks.
 The process architecture is thus an essential prerequisite to effective measurement. The key steps to advance to the Managed Process are:
1. Establish a minimum, basic set of process measurements to identify the quality and cost parameters of each process step. The objective is to quantify the relative costs and benefits of each major process activity, such as the cost and yield of error detection and correction methods (a minimal sketch follows this list).
2. Establish a process database with the resources to manage and maintain it. Cost and yield data should be maintained centrally to guard against loss, to make it available for all projects, and to facilitate process quality and productivity analysis.
3. Provide sufficient process resources to gather and maintain this data and to advise project members on its use. Assign skilled professionals to monitor the quality of the data before entry in the database and to provide guidance on analysis methods and interpretation.
4. Assess the relative quality of each product and inform management where quality targets are not being met. An independent quality-assurance group should assess the quality actions of each project and track its progress against its quality plan. When this progress is compared with the historical experience on similar projects, an informed assessment generally can be made.
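As a concrete illustration of steps 1 and 2, the sketch below computes the cost and yield of each defect-removal activity from a few per-phase records of the kind a process database would hold. This is a minimal sketch; the phase names, hours, and defect counts are hypothetical.

```python
# Hypothetical per-phase records: hours spent and defects found in each
# defect-removal activity, plus defects that escaped to later phases.
phases = [
    {"phase": "design review",   "hours": 40,  "found": 25, "escaped": 10},
    {"phase": "code inspection", "hours": 60,  "found": 45, "escaped": 12},
    {"phase": "system test",     "hours": 200, "found": 20, "escaped": 2},
]

for p in phases:
    present = p["found"] + p["escaped"]        # defects present when the phase ran
    yield_pct = 100.0 * p["found"] / present   # share of those defects removed here
    cost_per_defect = p["hours"] / p["found"]  # hours per defect removed
    print(f'{p["phase"]:15} yield={yield_pct:5.1f}%  cost={cost_per_defect:4.1f} h/defect')
```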
The Managed Process:
 In advancing from the Initial Process via the Repeatable and Defined Processes to the
Managed Process, software organizations typically will experience substantial quality
improvements.
 The greatest potential problem with the Managed Process is the cost of gathering data. There are an enormous number of potentially valuable measures of software development and support, but such data is expensive to gather and maintain.
 Therefore, approach data gathering with care and precisely define each piece of data in advance. Productivity data is generally meaningless unless explicitly defined. For example, the simple measure of lines of source code per development month can vary by 100 times or more, depending on the interpretation of the parameters.
 The code count could include only new and changed code or all shipped instructions. For modified programs, this can cause a ten-times variation. Similarly, you can use noncomment, nonblank lines, executable instructions, or equivalent assembler instructions, with variations again of up to seven times (a counting sketch appears at the end of this section). Management, test, documentation, and support personnel may or may not be counted when calculating labor months expended.
 Again, the variations can run at least as high as seven times. When different groups gather data but do not use identical definitions, the results are not comparable, even if it made sense to compare them. The tendency with such data is to use it to compare several groups and put pressure on those with the lowest ranking. This is a misapplication of process data.
 First, it is rare that two projects are comparable by any simple measures. The variations in
task complexity caused by different product types can exceed five to one. Similarly, the
cost per line of code of small modifications is often two to three times that for new
programs.
 The degree of requirements change can make an enormous difference, as can the design
status of the base program in the case of enhancements. Process data must not be used to
compare projects or individuals. Its purpose is to illuminate the product being developed
and to provide an informed basis for improving the process.
 When such data is used by management to evaluate individuals or teams, the reliability of
the data itself will deteriorate.
 The US Constitution's Fifth Amendment, which protects against self-incrimination, is
based on sound principles:
 Few people can be counted on to provide reliable data on their own performance.
 The two fundamental requirements to advance from the Managed Process to the
Optimizing Process are:
1. Support automatic gathering of process data. Some data cannot be gathered by hand, and
all manually gathered data is subject to error and omission.
2. Use this data to both analyze and modify the process to prevent problems and improve
efficiency.
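The size-definition problem described above is easy to demonstrate with a small counter. The sketch below shows how two common definitions of "lines of code" diverge for the same file; the sample source text is hypothetical, and only '#'-style line comments are handled.

```python
def count_loc(source: str) -> dict:
    """Count size under two definitions: total physical lines versus
    noncomment, nonblank lines (assumes '#'-style line comments)."""
    lines = source.splitlines()
    total = len(lines)
    ncnb = sum(1 for line in lines
               if line.strip() and not line.strip().startswith("#"))
    return {"total_lines": total, "noncomment_nonblank": ncnb}

sample = """# module header
# author, date

def add(a, b):
    # returns the sum
    return a + b
"""
print(count_loc(sample))  # the two counts differ for the same program
```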
The Optimizing Process:
 In varying degrees, process optimization goes on at all levels of process maturity. With the
step from the Managed to the Optimizing Process, however, there is a paradigm shift. Up
to this point, software development managers have largely focused on their products and
will typically only gather and analyze data that directly relates to product improvement.
 In the Optimizing Process, the data is available to actually tune the process itself. With a
little experience, management will soon see that process optimization can produce major
quality and productivity improvements.
 For example, many errors can be identified and fixed far more economically by code inspections than through testing. Unfortunately, there is little published data on the costs of finding and fixing errors. However, I have developed a useful rule of thumb from experience:
 It takes about one to four working hours to find and fix a bug through inspections and about 15 to 20 working hours to find and fix a bug in function or system test (a worked comparison appears at the end of this section). It is thus clear that testing is not a cost-effective way to find and fix most bugs.
 However, some kinds of errors are either uneconomical or almost impossible to find
except
by machine.
 Examples are errors involving spelling and syntax, interfaces, performance,
human factors, and error recovery.
 It would thus be unwise to eliminate testing completely because it provides a useful check
against human frailties.
 The data that is available with the Optimizing Process gives us a new perspective on
testing. For most projects, a little analysis shows that there are two distinct activities
involved.
 The first is the removal of bugs. To reduce this cost, inspections should be emphasized together with any other cost-effective techniques. The role of functional and system testing should then be changed to one of finding symptoms that are further explored to see if the bug is an isolated problem or if it indicates design problems that require more comprehensive analysis.
 In the Optimizing Process, the organization has the means to identify the weakest elements
of the process and fix them. At this point in process improvement, data is available to
justify the application of technology to various critical tasks and numerical evidence is
available on the effectiveness with which the process has been applied to any given
product.
 We no longer need reams of paper to describe what is happening because simple yield curves and statistical plots provide clear and concise indicators. It is now possible to assure the process and hence have confidence in the quality of the resulting products.
 People in the process. Any software development process is dependent on the quality of the people who implement it. Even with the best people, however, there is always a limit to what they can accomplish.
 When engineers are already working 50 to 60 hours a week, it is hard to see how they
could
handle the vastly greater challenges of the future.
 The Optimizing Process helps in several ways: It helps managers understand where help is needed and how best to provide the people with the support they require. It lets professionals communicate in concise, quantitative terms.
 This facilitates the transfer of knowledge and minimizes the likelihood of their wasting time on problems that have already been solved.
 It provides the framework for the professionals to understand their work performance and
to see how to improve it.
 This results in a highly professional environment and substantial productivity benefits, and it avoids the enormous amount of effort that is generally expended in fixing and patching other people's mistakes.
 The difference between a disciplined environment and a regimented one is that discipline
controls the environment and methods to specific standards while regimentation defines
the actual conduct of the work. Discipline is required in large software projects to ensure,
for example, that the people involved use the same conventions, don’t damage each
other’s products, and properly synchronize their work.
 Discipline thus enables creativity by freeing the most talented software professionals from the many crises that others have created.
 The need. There are many examples of disasters caused by software problems, ranging from expensive missile aborts to enormous financial losses.
 As the computerization of our society continues, the public risks due to poor-quality code will become untenable. Not only are our systems being used in increasingly sensitive applications, but they are also becoming much larger and more complex. While proper questions can be raised about the size and complexity of current systems, they are human creations and they will, alas, continue to be produced by humans - with all their failings and creative talents.
 While many of the currently promising technologies will undoubtedly help, there is an enormous backlog of needed functions that will inevitably translate into vast amounts of code. More code means increased risk of error and, when coupled with more complexity, these systems will become progressively less testable.
 The risks will thus increase astronomically as we become more efficient at producing prodigious amounts of new code. As well as being a management issue, quality is an economic one. It is always possible to do more inspections or to run more tests, but it costs time and money to do so.
 It is only with the Optimizing Process that the data is available to understand the costs and benefits of such work. The Optimizing Process thus provides the foundation for significant advances in software quality and simultaneous improvements in productivity.
There is little data on how long it takes for software organizations to advance through these maturity levels toward the Optimizing Process. Based on my experience, transition from level 1 to level 2 or from level 2 to level 3 takes from one to three years, even with a dedicated management commitment to process improvement. To date, no complete organizations have been observed at levels 4 or 5. To meet society's needs for increased system functions while simultaneously addressing the problems of quality and productivity, software managers and professionals must establish the goal of moving to the Optimizing Process.
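The rule of thumb quoted earlier (one to four hours per defect in inspections versus 15 to 20 hours in function or system test) can be turned into a simple break-even comparison. The defect count below is hypothetical; only the hour ranges come from the text.

```python
# Hours per defect, taken from the rule of thumb in the text.
INSPECTION_HOURS = (1, 4)   # find-and-fix cost range in inspections
TEST_HOURS = (15, 20)       # find-and-fix cost range in function/system test

defects = 100  # hypothetical number of defects to be removed either way

insp_low, insp_high = (defects * h for h in INSPECTION_HOURS)
test_low, test_high = (defects * h for h in TEST_HOURS)

print(f"Inspections: {insp_low}-{insp_high} hours for {defects} defects")
print(f"Testing:     {test_low}-{test_high} hours for {defects} defects")
print(f"Per defect, testing costs roughly {TEST_HOURS[0] / INSPECTION_HOURS[1]:.1f}x "
      f"to {TEST_HOURS[1] / INSPECTION_HOURS[0]:.0f}x as much")
```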
PROCESS REFERENCE MODELS
Capability Maturity Model (CMM):

The Capability Maturity Model (CMM) is a methodology used to develop and refine an organization's software development process. The model describes a five-level evolutionary path of increasingly organized and systematically more mature processes. CMM was developed and is promoted by the Software Engineering Institute (SEI), a research and development center sponsored by the U.S. Department of Defense (DoD). SEI was founded in 1984 to address software engineering issues and, in a broad sense, to advance software engineering methodologies. More specifically, SEI was established to optimize the process of developing, acquiring, and maintaining heavily software-reliant systems for the DoD. Because the processes involved are equally applicable to the software industry as a whole, SEI advocates industry-wide adoption of the CMM.

The CMM is similar to ISO 9001, one of the ISO 9000 series of standards specified by the
International Organization for Standardization (ISO). The ISO 9000 standards specify an effective
quality system for manufacturing and service industries; ISO 9001 deals specifically with
software development and maintenance. The main difference between the two systems lies in
their respective purposes: ISO 9001 specifies a minimal acceptable quality level for software
processes, while the CMM establishes a framework for continuous process improvement and is
more explicit than the ISO standard in defining the means to be employed to that end.

CMM's Five Maturity Levels of Software Processes

 At the initial level, processes are disorganized, even chaotic. Success is likely to depend on
individual efforts, and is not considered to be repeatable, because processes would not be
sufficiently defined and documented to allow them to be replicated.

 At the repeatable level, basic project management techniques are established, and successes can be repeated, because the requisite processes have been established, defined, and documented.

 At the defined level, an organization has developed its own standard software process through
greater attention to documentation, standardization, and integration.

 At the managed level, an organization monitors and controls its own processes through data
collection and analysis.

 At the optimizing level, processes are constantly being improved through monitoring feedback from current processes and introducing innovative processes to better serve the organization's particular needs.
Difference between CMM and CMMI


CMM is a reference model of mature practices in a specified discipline, such as the Systems Engineering CMM, Software CMM, People CMM, and Software Acquisition CMM, but these separate models were difficult to integrate as and when needed. CMMI is the successor of the CMM; it evolved as a more mature set of guidelines and was built by combining the best components of the individual discipline CMMs (Software CMM, People CMM, etc.). It can be applied to product manufacturing, people management, software development, and so on. CMM addresses software engineering alone, whereas CMMI (CMM Integrated) addresses both software and systems engineering. CMMI also incorporates Integrated Process and Product Development and supplier sourcing.

CMMI:
What is CMMI?
The CMM Integration project was formed to sort out the problem of using multiple CMMs. The CMMI product team's mission was to combine three source models into a single improvement framework for organizations pursuing enterprise-wide process improvement. These three source models are:
 Capability Maturity Model for Software (SW-CMM) v2.0 Draft C
 Electronic Industries Alliance Interim Standard (EIA/IS) 731 Systems Engineering
 Integrated Product Development Capability Maturity Model (IPD-CMM) v0.98
CMM Integration:
 Builds an initial set of integrated models.
 Improves best practices from source models based on lessons learned.
 Establishes a framework to enable integration of future models.

CMMI and Business Objectives:


The objectives of CMMI are as follows:
 Produce quality products or services: The process-improvement concept in CMMI models evolved out of the Deming, Juran, and Crosby quality paradigm: quality products are a result of quality processes. CMMI has a strong focus on quality-related activities, including requirements management, quality assurance, verification, and validation.
 Create value for the stockholders: Mature organizations are more likely to make better cost and revenue estimates than those with less maturity, and then perform in line with those estimates. CMMI supports quality products, predictable schedules, and effective measurement to support management in making accurate and defensible forecasts. This process maturity can guard against project performance problems that could weaken the value of the organization in the eyes of investors.
 Enhance customer satisfaction: Meeting cost and schedule targets with high-quality products that are validated against customer needs is a good formula for customer satisfaction. CMMI addresses all of these ingredients through its emphasis on planning, monitoring, and measuring, and the improved predictability that comes with more capable processes.
 Increase market share: Market share is a result of many factors, including quality products and services, name identification, pricing, and image. Customers like to deal with suppliers who have a reputation for meeting their commitments.
 Gain industry-wide recognition for excellence: The best way to develop a reputation for excellence is to consistently perform well on projects, delivering quality products and services within cost and schedule parameters. Having processes that conform to CMMI requirements can enhance that reputation.
The CMMI is structured as follows:
 Maturity Levels (staged representation) or Capability Levels (continuous representation)
 Process Areas
 Goals: Generic and Specific
 Common Features
 Practices: Generic and Specific
This chapter discusses the two CMMI representations; the remaining subjects are covered in subsequent chapters. A representation allows an organization to pursue different improvement objectives. An organization can follow one of the following two improvement paths.

Staged Representation:
The staged representation is the approach used in the Software CMM. It uses predefined sets of process areas to define an improvement path for an organization. This improvement path is described by a model component called a Maturity Level. A maturity level is a well-defined evolutionary plateau towards achieving improved organizational processes. The CMMI staged representation:
 Provides a proven sequence of improvements, each serving as a foundation for the next.
 Permits comparisons across and among organizations by the use of maturity levels.
 Provides an easy migration from the SW-CMM to CMMI.
 Provides a single rating that summarizes appraisal results and allows comparisons among organizations.
Thus the staged representation provides a predefined roadmap for organizational improvement, based on a proven grouping and ordering of processes and associated organizational relationships. You cannot divert from the sequence of steps.

CMMI Staged Structure:

The following figure illustrates the CMMI staged model structure.
Continuous Representation:
The continuous representation is the approach used in the SECM and the IPD-CMM. This approach allows an organization to select a specific process area and make improvements based on it. The continuous representation uses Capability Levels to characterize improvement relative to an individual process area.
CMMI Continuous Representation:
 Allows you to select the order of improvement that best meets your organization's business objectives and mitigates your organization's areas of risk.
 Enables comparisons across and among organizations on a process-area-by-process-area basis.
 Provides an easy migration from EIA 731 (and other models with a continuous representation) to CMMI.
Thus the continuous representation gives organizations the flexibility to choose the processes for improvement, as well as the amount of improvement required.
CMMI Continuous Structure:

The following figure illustrates the CMMI continuous model structure.

Maturity levels for services:

The process areas below and their maturity levels are listed for the CMMI for Services model:
Maturity Level 2 - Managed

 CM - Configuration Management
 MA - Measurement and Analysis
 PPQA - Process and Product Quality Assurance
 REQM - Requirements Management
 SAM - Supplier Agreement Management
 SD - Service Delivery
 WMC - Work Monitoring and Control
 WP - Work Planning
Maturity Level 3 - Defined

 CAM - Capacity and Availability Management
 DAR - Decision Analysis and Resolution
 IRP - Incident Resolution and Prevention
 IWM - Integrated Work Management
 OPD - Organizational Process Definition
 OPF - Organizational Process Focus
 OT - Organizational Training
 RSKM - Risk Management
 SCON - Service Continuity
 SSD - Service System Development
 SST - Service System Transition
 STSM - Strategic Service Management
Maturity Level 4 - Quantitatively Managed

 OPP - Organizational Process Performance
 QWM - Quantitative Work Management
Maturity Level 5 - Optimizing

 CAR - Causal Analysis and Resolution
 OPM - Organizational Performance Management

Advantages of CMMI
There are numerous benefits of implementing CMMI in an IT / Software Development
Organization, some of these benefits are listed below:

 A culture of maintaining quality in projects takes hold across the organization, from junior programmers to senior programmers and project managers
 A centralised QMS for implementation in projects ensures uniformity in documentation, which means a shorter learning cycle for new resources and better management of project status and health
 Incorporation of software engineering best practices in the organization, as described in the CMMI model
 Cost savings in terms of less effort due to fewer defects and less rework
 This also results in increased productivity
 On-Time Deliveries
 Increased Customer Satisfaction
 Overall increased Return on Investment
 Decreased Costs
 Improved Productivity

Disadvantages of CMMI

 CMMI-DEV may not be suitable for every organization.
 It may add overhead in terms of documentation.
 Smaller organizations may need additional resources and knowledge to initiate CMMI-based process improvement.
 It may require a considerable amount of time and effort for implementation.
 It may require a major shift in organizational culture and attitude.
PCMM:

The People Capability Maturity Model (PCMM®) is a maturity framework for continuously improving the management and development of an organization's workforce. Implementation support for PCMM® covers five points:

1. Undertaking gap analysis to assess the current level of maturity of the organization against the PCMM® levels.
2. Establishing the level of PCMM® at which the organization is.
3. Recommending whether to go for upgradation in maturity level and suggesting timelines for such upgradation.
4. Providing recommendations for addressing key findings.
5. Developing a roadmap and action plan for upgradation to the next level along with specific timelines.

1. Undertaking gap analysis to assess the current level of maturity of the organization against PCMM®.

This implementation support covers the concepts, principles, and framework for assessing People CMM® practices, and helps an organization assess the status of the People CMM® goals and practices being implemented. The process of assessment is as follows:

 Preparing Phase – Preparing for the assessment
 Surveying Phase – Conducting the People CMM® Practices (people policies, procedures, etc.) survey
 Assessment Phase – Conducting the onsite assessment
 Reporting Phase – Reporting the assessment results

2. Establishing the level of PCMM® at which the organization is.

Guidelines on how to map the assessment results to the structural components of People CMM®.

Components in People CMM®

Based on the maturity level determined, a presentation of pros and cons is made to the authorities, with recommendations on whether to go for upgradation in maturity level and suggested timelines for such upgradation. This facilitates easy decision-making by the authorities.

3. Recommending whether to go for upgradation in maturity level and suggesting timelines for such upgradation.

4. Providing recommendations for addressing key findings.

Recommendations for addressing key findings are made at:

 Process-area-wise goals level
 Process-area-wise implementation practices
 Process-area-wise institutionalization practices

Recommendations would include policies, procedures, practices, and guidelines for implementation and institutionalization of the same.

5. Developing a roadmap and action plan for upgradation to the next level along with specific timelines.

A roadmap and action plan for upgradation to the next level, with specific timelines and a date-wise action plan, is developed with the organizational authorities.

10 Principles of People Capability Maturity Model (PCMM)

1. In mature organizations, workforce capability is directly related to business performance.


2. Workforce capability is a competitive issue and a source of strategic advantage.


3. Workforce capability must be defined in relation to the organization’s strategic business
objectives.
4. Knowledge-intensive work shifts the focus from job elements to workforce competencies.
5. Capability can be measured and improved at multiple levels, including individuals, workgroups,
workforce competencies, and the organization.
6. An organization should invest in improving the capability of those workforce competencies that
are critical to its core competency as a business.
7. Operational management is responsible for the capability of the workforce.
8. The improvement of workforce capability can be pursued as a process composed of proven practices and procedures.
9. The organization is responsible for providing improvement opportunities, while individuals are
responsible for taking advantage of them.
10. Since technologies and organizational forms evolve rapidly, organizations must continually
evolve their workforce practices and develop new workforce competencies.

PCMM process areas by maturity level:

Level 2: Staffing, Communication and Coordination, Work Environment, Performance Management, Training and Development, Compensation.

Level 3: Competency Analysis, Workforce Planning, Competency Development, Career Development, Competency-Based Practices, Workgroup Development, Participatory Culture.

Level 4: Competency Integration, Empowered Workgroups, Competency-Based Assets, Quantitative Performance Management, Organizational Capability Management, Mentoring.

Level 5: Continuous Capability Improvement, Organizational Performance Alignment, Continuous Workforce Innovation.

PSP:
The Personal Software Process (PSP) is a structured software development process that is
designed to help software engineers better understand and improve their performance by bringing
discipline to the way they develop software and tracking their predicted and actual development
of the code. It clearly shows developers how to manage the quality of their products, how to make
a sound plan, and how to make commitments. It also offers them the data to justify their plans.
They can evaluate their work and suggest improvement direction by analyzing and reviewing
development time, defects, and size data. The PSP was created by Watts Humphrey to apply the underlying principles of the Software Engineering Institute's (SEI) Capability Maturity Model (CMM) to the software development practices of a single developer. It claims to give software engineers the process skills necessary to work on a Team Software Process (TSP) team.
Objectives
The PSP aims to provide software engineers with disciplined methods for improving personal
software development processes. The PSP helps software engineers to:

 Improve their estimating and planning skills.


 Make commitments they can keep.
 Manage the quality of their projects.
 Reduce the number of defects in their work.
PSP structure
PSP training follows an evolutionary improvement approach: an engineer learning to integrate the
PSP into his or her process begins at the first level – PSP0 – and progresses in process maturity to
the final level – PSP2.1. Each Level has detailed scripts, checklists and templates to guide the
engineer through required steps and helps the engineer improve her own personal software
process. Humphrey encourages proficient engineers to customize these scripts and templates as
they gain an understanding of their own strengths and weaknesses.
Process
The input to the PSP is the requirements; a requirements document is completed and delivered to the engineer.
PSP0, PSP0.1 (Introduces process discipline and measurement)
PSP0 has 3 phases: planning, development (design, code, compile, test), and a post mortem. A baseline of the current process is established by measuring time spent on programming, faults injected/removed, and the size of the program. In the post mortem, the engineer ensures all data for the project has been properly recorded and analysed. PSP0.1 advances the process by adding a coding standard, a size measurement, and the development of a personal process improvement plan (PIP). In the PIP, the engineer records ideas for improving his own process.
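A minimal sketch of the kind of baseline log PSP0 asks for is shown below, assuming a simple in-memory record of time, defects, and size; the phase names and figures are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class PhaseRecord:
    """One PSP0 log entry: time spent and defects injected/removed in a phase."""
    phase: str
    minutes: int
    defects_injected: int
    defects_removed: int

# Hypothetical baseline data for one small program.
log = [
    PhaseRecord("plan",       30,  0, 0),
    PhaseRecord("design",     60,  2, 0),
    PhaseRecord("code",      120,  5, 1),
    PhaseRecord("compile",    15,  0, 2),
    PhaseRecord("test",       90,  0, 4),
    PhaseRecord("postmortem", 20,  0, 0),
]
program_loc = 150  # measured program size (PSP0.1 adds a counting standard)

print(f"Total time: {sum(r.minutes for r in log)} min for {program_loc} LOC")
for r in log:
    print(f"{r.phase:10} {r.minutes:4} min  injected {r.defects_injected}, removed {r.defects_removed}")
```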
PSP1, PSP1.1 (Introduces estimating and planning)
Based upon the baseline data collected in PSP0 and PSP0.1, the engineer estimates how large a
new program will be and prepares a test report (PSP1). Accumulated data from previous projects
is used to estimate the total time. Each new project will record the actual time spent. This
information is used for task and schedule planning and estimation (PSP1.1).
PSP2, PSP2.1 (Introduces quality management and design)
PSP2 adds two new phases: design review and code review. Defect prevention and removal are the focus of PSP2. Engineers learn to evaluate and improve their process by measuring how long tasks take and the number of defects they inject and remove in each phase of development. Engineers construct and use checklists for design and code reviews. PSP2.1 introduces design specification and analysis techniques.
(PSP3 is a legacy level that has been superseded by TSP.)
PSP process flow:


One of the core aspects of the PSP is using historical data to analyze and improve process
performance. PSP data collection is supported by four main elements:

 Scripts
 Measures
 Standards
 Forms
The PSP scripts provide expert-level guidance for following the process steps, and they provide a framework for applying the PSP measures. The PSP has four core measures:

 Size – the size measure for a product part, such as lines of code (LOC).
 Effort – the time required to complete a task, usually recorded in minutes.
 Quality – the number of defects in the product.
 Schedule – a measure of project progression, tracked against planned and actual completion
dates.
Applying standards to the process can ensure the data is precise and consistent. Data is logged in
forms, normally using a PSP software tool. The SEI has developed a PSP tool and there are also
open source options available, such as Process Dashboard.
The key data collected in the PSP tool are time, defect, and size data – the time spent in each
phase; when and where defects were injected, found, and fixed; and the size of the product parts.
Software developers use many other measures that are derived from these three basic measures to
understand and improve their performance. Derived measures include:

 estimation accuracy (size/time)


 prediction intervals (size/time)
 time in phase distribution
 defect injection distribution
 defect removal distribution
 productivity
 reuse percentage
 cost performance index
 planned value
 earned value
 predicted earned value
 defect density
 defect density by phase
 defect removal rate by phase
 defect removal leverage
 review rates
 process yield
 phase yield
 failure cost of quality (COQ)
 appraisal COQ
 appraisal/failure COQ ratio
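Several of the derived measures listed above follow directly from the basic time, defect, and size data. The sketch below computes a few of them from hypothetical totals; the formulas are the commonly used PSP definitions in simplified form.

```python
# Hypothetical totals taken from one project's time, defect, and size logs.
new_and_changed_loc = 1200
total_minutes = 2400
minutes_by_phase = {"design review": 180, "code review": 240,
                    "compile": 60, "test": 300}
defects_removed_in_reviews = 28     # removed before compile and test
defects_total = 40                  # removed over the whole cycle

# Productivity: size produced per hour of total development time.
productivity = new_and_changed_loc / (total_minutes / 60)

# Defect density: defects per thousand lines of new and changed code.
defect_density = 1000 * defects_total / new_and_changed_loc

# Process yield (simplified): share of defects removed before compile/test.
process_yield = 100 * defects_removed_in_reviews / defects_total

# Appraisal and failure cost of quality, and their ratio (A/FR).
appraisal = minutes_by_phase["design review"] + minutes_by_phase["code review"]
failure = minutes_by_phase["compile"] + minutes_by_phase["test"]
afr = appraisal / failure

print(f"productivity   = {productivity:.1f} LOC/hour")
print(f"defect density = {defect_density:.1f} defects/KLOC")
print(f"process yield  = {process_yield:.1f} %")
print(f"A/FR ratio     = {afr:.2f}")
```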

Planning and tracking:


Logging time, defect, and size data is an essential part of planning and tracking PSP projects, as
historical data is used to improve estimating accuracy.
The PSP uses the PROxy-Based Estimation (PROBE) method to improve a developer's
estimating skills for more accurate project planning. For project tracking, the PSP uses the earned
value method.
The PSP also uses statistical techniques, such as correlation, linear regression, and standard
deviation, to translate data into useful information for improving estimating, planning and quality.
These statistical formulas are calculated by the PSP tool.
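The statistical techniques mentioned above can be illustrated with a very small example. The sketch below fits a least-squares line of actual effort against estimated size, in the spirit of PROBE; the historical data points are hypothetical, and a real PSP tool would also report prediction intervals.

```python
import statistics

# Hypothetical history: (estimated size in LOC, actual development minutes).
history = [(120, 300), (200, 430), (350, 700), (90, 260), (500, 980)]
xs = [size for size, _ in history]
ys = [minutes for _, minutes in history]

# Least-squares fit: effort = beta0 + beta1 * size.
mean_x, mean_y = statistics.fmean(xs), statistics.fmean(ys)
beta1 = (sum((x - mean_x) * (y - mean_y) for x, y in history)
         / sum((x - mean_x) ** 2 for x in xs))
beta0 = mean_y - beta1 * mean_x

planned_size = 250  # estimated size of the new program
predicted_minutes = beta0 + beta1 * planned_size
print(f"effort = {beta0:.1f} + {beta1:.2f} * size")
print(f"predicted effort for {planned_size} LOC: {predicted_minutes:.0f} minutes")
```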

Using the PSP:


The PSP is intended to help a developer improve their personal process; therefore PSP developers
are expected to continue adapting the process to ensure it meets their personal needs.
PSP and the TSP:
In practice, PSP skills are used in a TSP team environment. TSP teams consist of PSP-trained
developers who volunteer for areas of project responsibility, so the project is managed by the team
itself. Using personal data gathered with their PSP skills, the team makes the plans and estimates and controls the quality.
Using PSP process methods can help TSP teams to meet their schedule commitments and produce
high quality software. For example, according to research by Watts Humphrey, a third of all
software projects fail,[3] but an SEI study on 20 TSP projects in 13 different organizations found
that TSP teams missed their target schedules by an average of only six percent.[4]
Successfully meeting schedule commitments can be attributed to using historical data to make
more accurate estimates, so projects are based on realistic plans – and by using PSP quality
methods, they produce low-defect software, which reduces time spent on removing defects in later
phases, such as integration and acceptance testing.
PSP and other methodologies:
The PSP is a personal process that can be adapted to suit the needs of the individual developer. It
is not specific to any programming or design methodology; therefore it can be used with different
methodologies, including Agile software development.
Software engineering methods can be considered to vary from predictive through adaptive. The
PSP is a predictive methodology, and Agile is considered adaptive, but despite their differences,
the TSP/PSP and Agile share several concepts and approaches – particularly in regard to team
organization. They both enable the team to:

 Define their goals and standards.


 Estimate and schedule the work.
 Determine realistic and attainable schedules.
 Make plans and process improvements.

Both Agile and the TSP/PSP share the idea of team members taking responsibility for their own
work and working together to agree on a realistic plan, creating an environment of trust and
accountability. However, the TSP/PSP differs from Agile in its emphasis on documenting the
process and its use of data for predicting and defining project schedules.
Quality:
High-quality software is the goal of the PSP, and quality is measured in terms of defects. For the
PSP, a quality process should produce low-defect software that meets the user needs.
The PSP phase structure enables PSP developers to catch defects early. By catching defects early,
the PSP can reduce the amount of time spent in later phases, such as Test.
The PSP theory is that it is more economical and effective to remove defects as close as possible
to where and when they were injected, so software engineers are encouraged to conduct personal
reviews for each phase of development. Therefore, the PSP phase structure includes two review
phases:

 Design Review
 Code Review
To do an effective review, you need to follow a structured review process. The PSP recommends
using checklists to help developers to consistently follow an orderly procedure.
The PSP follows the premise that when people make mistakes, their errors are usually predictable,
so PSP developers can personalize their checklists to target their own common errors. Software
engineers are also expected to complete process improvement proposals, to identify areas of
weakness in their current performance that they should target for improvement. Historical project
data, which exposes where time is spent and defects introduced, help developers to identify areas
to improve.
PSP developers are also expected to conduct personal reviews before their work undergoes a peer
or team review.
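The idea of personalizing a review checklist from one's own defect history can be sketched as follows; the defect log entries and type names are hypothetical.

```python
from collections import Counter

# Hypothetical personal defect log: (defect type, phase where it was injected).
defect_log = [
    ("interface", "design"), ("off-by-one", "code"), ("interface", "design"),
    ("missing-initialization", "code"), ("interface", "code"),
    ("off-by-one", "code"), ("spelling", "code"),
]

# Build a personal code-review checklist from the most frequent defect types.
counts = Counter(dtype for dtype, _ in defect_log)
checklist = [f"Check for {dtype} defects ({n} found historically)"
             for dtype, n in counts.most_common(3)]

for item in checklist:
    print("[ ]", item)
```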

TSP:
In combination with the Personal Software Process (PSP), the Team Software Process (TSP) provides a defined operational process framework that is designed to help teams of managers and engineers organize projects and produce software for products that range in size from small projects of several thousand lines of code (KLOC) to very large projects of greater than half a million lines of code. The TSP is intended to improve the levels of quality and productivity of a team's software development project, in order to help them better meet the cost and schedule commitments of developing a software system.
The initial version of the TSP was developed and piloted by Watts Humphrey in the late 1990s[5]
and the Technical Report[6] for TSP sponsored by the U.S. Department of Defense was published
in November 2000. The book by Watts Humphrey,[7] Introduction to the Team Software Process,
presents a view of the TSP intended for use in academic settings, that focuses on the process of
building a software production team, establishing team goals, distributing team roles, and other
teamwork-related activities.
The primary goal of the TSP is to create a team environment for establishing and maintaining a self-directed team, and to support disciplined individual work as a base of the PSP framework. A self-directed team means that the team manages itself, plans and tracks its work, manages the quality of its work, and works proactively to meet team goals. The TSP has two principal components: team-building and team-working. Team-building is a process that defines roles for each team member and sets up teamwork through the TSP launch and periodic relaunches. Team-working is a process that deals with the engineering processes and practices utilized by the team. The TSP, in short, provides engineers and managers with a way of establishing and managing their team to produce high-quality software on schedule and within budget.
How the TSP works
Before engineers can participate in the TSP, it is
required that they have already learned about the PSP, so that the TSP can work effectively.
Training is also required for other team members, the team lead and management. The TSP
software development cycle begins with a planning process called the launch, led by a coach who
has been specially trained, and is either certified or provisional.[8][9] The launch is designed to
begin the team building process, and during this time teams and managers establish goals, define
team roles, assess risks, estimate effort, allocate tasks, and produce a team plan. During an
execution phase, developers track planned and actual effort, schedule, and defects, meeting regularly (usually weekly) to report status and revise plans. A development cycle ends with a Post
Mortem to assess performance, revise planning parameters, and capture lessons learned for
process improvement.
The coach role focuses on supporting the team and
the individuals on the team as the process expert while being independent of direct project
management responsibility.[10][11] The team leader role is different from the coach role in that team leaders are responsible to management for products and project outcomes, while the coach is responsible for developing individual and team performance.