
Programme Management Handover Checklist

RICHARD C.

CLARETY CONSULTING CHECKLIST

Please refer to my post http://www.claretyconsulting.com/it/comments/free-programme-management-handover-checklist/ (2007-05-22) before using this checklist.

Checklist Assumptions

- A software house /consultancy is managing a programme on behalf of a customer.
- Large-scale programme with a budget of 10 million +.
- No component projects have been completed.

For each item below, record Yes /No and a Comment.

1 Performance History

1.2 What is your organisation's experience in running programmes of this type and size in the past? Are any programme close reports /minutes /artefacts from these programmes available? Has the past experience been good and, if not, why not? Are these faults potentially being repeated in your programme? What can you do to immunise your programme against these potential failure points? Have these steps already been taken? Don't repeat history. Remember: "If you do what you have always done, you will get what you have always got." History does repeat itself.
2 Check Contracts

2.1 What contracts /SOWs have been signed and agreed directly with the client, and which are in the process of being designed /agreed? You need to understand the deal (fixed price /T&M) and the objectives of the programme from a baseline perspective. Also understand the financial constraints and penalties detailed in these key documents. Do these documents identify any key payment milestones? These are the documents of last resort if all goes wrong in the running of a programme, in terms of protecting your employer's balance sheet, so detailed knowledge of these artefacts is key.

2.2 Check all agreed third-party contracts /SOWs /SLAs and those in the process of being designed /agreed. Understand the deal (fixed price /T&M) and what these third parties are targeted to deliver. Find out your employer's financial commitment to these organisations going forward and their role in the programme. Also understand the financial constraints and penalties detailed in these key documents. Once again, these are documents of last resort if all goes wrong in the running of a programme, in terms of protecting your employer's balance sheet, so intimate knowledge of these artefacts is key.

Copyright @ Clarety Consulting

Page 1 of 7

Created 05/05/2016

2.3 Assess all software licences /warranties /support contracts, and any in the pipeline for agreement, which are relevant to the programme's benefit objectives. How many users are allowed, what are the renewal dates and costs, and are there any issues associated with these agreements?

3 Financials

3.1 Does the programme have access to the client's Business Case?

3.2 What are the key benefits the client is looking for, and how do these relate to the contract in place? Do they align and, if not, why? This gap could be a potential source of additional revenue through increased scope for your programme.

3.3 If you do not have access to the client's business case, do you have an agreed programme-level business case?

3.5 Identify your organisation's target margin. What net profitability is your organisation hoping to make at the end of this deal?

3.6 Identify the presence of a fit-for-purpose time recording system.

3.7 Is this time recording system kept up to date? If yes, how frequently is it updated, and has the customer ever complained about the billing arising from this system?

3.8 Check that earned value analysis and stats are being produced at project level. Are they being calculated consistently between projects?
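The consistency check above can be made mechanical: given each project's planned value, earned value, and actual cost to date, the same formulas should reproduce every project's reported indices. A minimal sketch with hypothetical figures (no real programme data):

```python
# Earned value analysis from three inputs per project:
#   pv = planned value of work scheduled to date
#   ev = earned value of work actually completed to date
#   ac = actual cost incurred to date
def earned_value(pv: float, ev: float, ac: float) -> dict:
    return {
        "cpi": ev / ac,  # cost performance index (<1 means over budget)
        "spi": ev / pv,  # schedule performance index (<1 means behind plan)
        "cost_variance": ev - ac,
        "schedule_variance": ev - pv,
    }

# Hypothetical project: 500k planned, 400k earned, 450k spent to date.
stats = earned_value(pv=500_000, ev=400_000, ac=450_000)
print(round(stats["cpi"], 2), round(stats["spi"], 2))  # prints: 0.89 0.8
```

If two projects report the same raw figures but different indices, they are not calculating consistently.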

3.9 Assess the presence of a programme budgeting and monitoring process.

3.10 Check the numbers for current margin, burn rate, and forecast cost for the programme's completion.
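These three numbers can be cross-checked from the raw financials. A sketch using a naive straight-line forecast and entirely hypothetical figures; where CPI data exists, an EVA-based estimate at completion is the stronger check:

```python
# Cross-check margin, burn rate, and forecast completion cost.
# All figures are hypothetical.
contract_value = 12_000_000.0  # total fixed-price value agreed with the client
spend_to_date = 4_500_000.0    # actual cost incurred so far
months_elapsed = 9
months_remaining = 12

burn_rate = spend_to_date / months_elapsed                    # cost per month to date
forecast_cost = spend_to_date + burn_rate * months_remaining  # straight-line projection
margin = (contract_value - forecast_cost) / contract_value

# prints: burn rate 500,000/month, forecast cost 10,500,000, margin 12.5%
print(f"burn rate {burn_rate:,.0f}/month, forecast cost {forecast_cost:,.0f}, margin {margin:.1%}")
```

Comparing this naive projection against the figure the programme itself reports is a fast way to spot optimistic forecasting.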

3.11 Assess the figures relating to specific projects, identify those in the red which therefore need special attention, and decide and plan with the PMs an appropriate budgetary remediation plan.

4 Project Management Office (PMO)

4.1 Does your programme have a PMO?

4.2 Has a PDD been produced and agreed by the client, defining the role of the PMO and the vision and strategy for the programme as a whole? Is it comprehensive and fit for purpose?

4.3 Does the PDD contain a detailed programme blueprint?


4.4 Does the programme have an agreed Programme Brief? What are the programme's drivers in terms of the ordering of Quality, Time, Cost, and Scope? Are these reflected in the programme approach to date?

4.6 Does the programme have an agreed Programme Mandate?

4.7 Does the programme have a communications plan /strategy? How current is this plan and how frequently is it updated? What sources are used to update it? Are the sources reliable and accurate? Check QA audit reports if present.

4.8 Does the programme have an agreed quality strategy and associated project-level quality plans? Note that there is a separate section on Quality later in this checklist.

4.9 Carry out a personal audit of PMO process /capability and performance. Check that this team is right-sized for the programme's complexity and size, i.e. quality manager /team, financial controller, programme planner, risk manager, resource manager, test manager, project managers, Tech.

4.10 Who are your PMO team and what are their perceived roles and responsibilities? Interview them and identify their subjective views of the programme's status and their perceived view of their own role and objectives. Do the perceived roles equate to those defined in the PDD?

4.11 Get team members individually to run through the latest reports, audits, plans, and risk /issue registers etc. Assess team capability and right-size experience.

4.12 Assess PMO team morale.

4.13 Does the programme follow a set of agreed methodologies defined in the PDD, and are these appropriate for the type of programme currently underway? E.g. for project management: the Practical Software and Systems Measurement (PSM) framework; the Software Engineering Institute's (SEI) Capability Maturity Models (for example the Software Acquisition (SA)-CMM, Capability Maturity Model Integration (CMMI), and People-CMM); the Project Management Institute's (PMI) Body of Knowledge (PMBOK) Guide; and SEI's IDEAL (initiating, diagnosing, establishing, acting, and learning) organisational improvement model. For software development: Feature Driven Development, SSADM (rapid or waterfall), RUP, etc.

4.14 Can you check for compliance with these methods? This is important because many software houses use fashionable method statements as a sales tool rather than a description of what actually happens.

4.15 Does each project have an agreed, fit-for-purpose Project Initiation Document? Does this conform to other programme-level documentation, i.e. business case, PDD, QA and test documents?

4.16 Does the programme have a risk /issue management process? Is this fit for purpose? Key: check that the delivery team and client (stakeholders) are bought into this important set of processes and are actively participating in them in a meaningful and efficient way. If not, set out a plan to remediate the situation.
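When auditing the register itself, check that every entry carries enough data to be prioritised and actioned. A minimal sketch of the structure to look for; the field names and scoring threshold here are illustrative assumptions, not taken from any particular method:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    id: str
    description: str
    probability: int   # 1 (rare) .. 5 (almost certain)
    impact: int        # 1 (negligible) .. 5 (severe)
    owner: str
    mitigation: str = ""

    @property
    def score(self) -> int:
        # Classic probability x impact scoring; 15+ typically needs board attention.
        return self.probability * self.impact

register = [
    Risk("R1", "Key supplier SLA unsigned", 4, 5, "Commercial lead",
         "Escalate to sponsor; agree interim terms"),
    Risk("R2", "Offshore test environment unstable", 3, 3, "Test manager"),
]

# Review highest-scoring risks first, and flag any without a mitigation.
for r in sorted(register, key=lambda r: r.score, reverse=True):
    flag = "" if r.mitigation else "  <-- no mitigation recorded"
    print(f"{r.id} score={r.score} owner={r.owner}{flag}")
```

Entries with no owner, no score, or no mitigation are the quickest sign that the process exists on paper only.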


4.17 As part of the personal audit, review the latest quality management audit reports /programme plans and associated project plans. Check that these plans are of the correct quality: bottom-up planned and regularly updated with full staff participation. Are the plans realistic and achievable in relation to current assumptions and risk /issue profiles?

4.18 Does the programme have Change Control procedures?

4.19 Does the programme have a change manager? Assess the change process and how it operates. Is it sound, and does it conform to the PDD and other agreed programme-level documentation?

4.20 How much change has taken place? Compare it to the original contract /business case. Have the changes gone off message in terms of the original objectives laid down in the PDD /business case and associated PIDs?

4.21 Does the programme have a programme /project workspace capability? This is quite important for version control and for transparency to all stakeholders and participants in the programme.

5 Stakeholder Management

5.1 Does the programme have a senior-level client sponsor?

5.2 Is this sponsor active /participative and capable of providing programme steer? When faced with issues that relate to client-side failings and errors, is this person willing and able to motivate participating client teams to take corrective action in order to mitigate /resolve such issues as they arise?

5.3 Does the programme have, at programme and project level, stakeholder boards with senior users /departmental business sponsors present?

5.4 Are QA representatives present on these boards?

5.6 Make appointments to see stakeholders and assess their expectations and views on status. What do they see as their top 10 issues with the programme so far?

5.7 Are these boards meeting frequently, and are they fulfilling their roles as outlined in the PDD and PIDs? Are risk /issue registers decreasing in size, and are high-impact items being resolved /mitigated quickly?

6 Resource Management

6.1 Does the programme have a resource manager? Interview /meet this person and assess their capability /experience and their view of the current resource status.

6.2 Does the resource manager have the correct level of authority over resource? Does this person have the ultimate ability to move resource on and off projects? What is this person's authority?

6.3 Does the programme have high-level resource plans?


6.4 Does each project have resource-levelled project resource plans?

6.5 What resource constraints apply to the programme? Assess the impact now and going forward. What can be done to improve the situation?

6.6 What is the degree of cross-project resource dependency? Assess the risk of project /programme slippages. Is the resourcing under- or over-optimised?
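Cross-project dependency risk can be made concrete by propagating a slip through the dependency graph: a delay in one project only moves the end date if that project sits on the critical path. A sketch with hypothetical projects and durations:

```python
# Propagate slippage through cross-project dependencies (durations in weeks).
durations = {"core_build": 20, "data_migration": 12, "integration": 8, "uat": 6}
deps = {
    "core_build": [],
    "data_migration": [],
    "integration": ["core_build", "data_migration"],  # waits for both
    "uat": ["integration"],
}

def finish_week(project: str, slip: dict) -> int:
    """Earliest finish of `project`, given per-project slippage in weeks."""
    start = max((finish_week(d, slip) for d in deps[project]), default=0)
    return start + durations[project] + slip.get(project, 0)

print(finish_week("uat", {}))                      # prints: 34
print(finish_week("uat", {"data_migration": 10}))  # prints: 36
```

Here a 10-week slip in data_migration costs only 2 weeks end to end, because that project carries 8 weeks of float behind core_build; how much float of this kind exists is one way to judge whether resourcing is under- or over-optimised.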

7 Quality Management /Process Improvement Team

7.1 Check whether a QA programme strategy has been agreed and, if so, obtain a copy and check that it has a fit-for-purpose QA scope for the programme, related to its size and complexity: i.e. does it have QA checklists, cover software and management QA processes and audits, and define reporting lines and the QA vision and objectives?

7.2 Do all the projects have individually agreed QA plans with fit-for-purpose content relating back to the authority provided in the QA strategy? Do these documents contain QA review /audit timetables and the roles and responsibilities in these audits? Do the documents specify checklists detailing the quality standards expected for each project?

7.3 Depending on programme size, check whether there is a QA manager /team engaged, and what they perceive their role to be in the delivery of the programme. Refer back to the QA strategy.

7.4 What is the reporting line for the QA team? Do they report to you or directly to project /programme boards?

7.5 Is there QA buy-in among the delivery team and the client?

7.6 Are quality checking activities actually taking place in accordance with the QA strategy and associated plans?

7.7 Are these happening according to a well-planned /structured timetable? Are delivery teams (project management /software engineers) aware of this timetable? Are checks frequent enough?

7.8 Are the audit checklists available beforehand to all staff subject to audits, and are the checklists fit for purpose in terms of depth and coverage?

7.9 Does the QA team run a standardised document template repository, maintained and enforced to make sure that released documentation is of a common format and standard?

7.10 Check for evidence of peer reviews of all documentation /systems before release to the client.


7.11 Read all QA audit reports and completed QA questionnaires, and assess, in conjunction with the QA team, the programme's actual QA status and where the potential problem areas might be. Agree QA remediation steps going forward to bottom out problem areas.

7.12 Does the QA team have, as part of the QA strategy, a detailed configuration management plan covering documentation /systems and tools?

7.13 Project status reports: check the latest ones and assess the status as viewed by the client.

7.14 Do the development teams have to work to an agreed set of coding standards? Are these fit for purpose bearing in mind the programme size and complexity?

7.15 Are code reviews carried out?

7.16 Are Configuration Management conventions being followed?

8 Software Design /Development

8.1 Does the programme have a designated design authority (Senior Tech Architect)? Does this person have the capability and experience to fit the programme's scale and complexity?

8.2 Ask the Senior Tech Architect to explain the technical solution being delivered to the client, in line with any acceptance criteria (test strategy) and the business case. Assess, where possible, technical feasibility and complexity.

8.3 How are business requirements recorded and gathered? What approach, tools, documents, and standards are used?

8.4 Does the programme use Business Analysts (BAs)?

8.5 What are the experience and capabilities of the BA team? Assess them.

8.6 Are all requirements agreed with the client?

8.7 Is usability design /testing carried out as part of the requirements gathering process?

8.8 Are load and concurrency requirements gathered and properly considered in the architectural design of the system?

8.9 How are client requirements extrapolated into low-level tech requirements, i.e. domain modelling and data schema design? Is the business requirements model of sufficient detail (level of abstraction) and quality to allow tech design artefacts to be produced which accurately reflect these agreed requirements?
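A quick probe here is to take one agreed business requirement and see whether the domain model expresses it directly. A hypothetical illustration, assuming a requirement such as "a customer may hold many accounts, each in exactly one currency" (the names and the rule are invented for the example):

```python
from dataclasses import dataclass, field

# Hypothetical business requirement: "A customer may hold many accounts,
# each denominated in exactly one currency." The domain model should make
# that rule visible without reading any procedural code.

@dataclass
class Account:
    account_id: str
    currency: str                 # exactly one currency per account
    balance_minor_units: int = 0  # money in minor units, never floats

@dataclass
class Customer:
    customer_id: str
    name: str
    accounts: list[Account] = field(default_factory=list)  # one-to-many

alice = Customer("C001", "Alice")
alice.accounts.append(Account("A001", "GBP"))
alice.accounts.append(Account("A002", "EUR"))
print(len(alice.accounts))
```

If translating one plain-English requirement into the model takes interpretation or debate, the abstraction level is probably too coarse for accurate tech design artefacts.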


8.10 Are these tech design models of sufficient detail to produce developer-level tech specs that can form the basis of definable and targetable work-package deliverables? These are important for planning, progress monitoring, and QA-related activities. With offshore development teams this is even more important, as it is often difficult for BAs and Tech Architects to explain ambiguities in design over the telephone. Clarity is important.

8.11 Is the tech requirements documentation of sufficient detail and quality to make the system maintainable once released?


9 Testing

9.1 Does the programme have an agreed test strategy? This should contain the all-important acceptance criteria for projects, the methods to be used for testing systems, and how UAT will be conducted on projects.

9.2 Is unit testing carried out by developers? How? When?
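When checking the "how", look for tests that pin down behaviour at the boundaries, not just the happy path. A minimal, hypothetical example of the shape to look for, using the standard library's unittest (the checklist names no framework, so that choice is an assumption, as is the billing rule):

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical billing rule standing in for real programme code."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_normal_case(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_boundaries(self):
        # Zero and full discount are the edge cases most likely to break.
        self.assertEqual(apply_discount(99.99, 0), 99.99)
        self.assertEqual(apply_discount(99.99, 100), 0.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(10.0, 101)
```

Run with `python -m unittest`. Tests like these also answer the "when": if they run in the continuous build (see 9.9), unit testing is happening on every change rather than at the end.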

9.3 Are the business requirements gathered of sufficient clarity to allow the development of test scripts?

9.4 Has the system been test-scripted for all key flows through the system?

9.5 Are the test scripts of sufficient quality to allow offshore regression testing if necessary?

9.6 Have these test scripts been signed off by the client as fit for purpose in terms of the originally agreed test strategy and associated acceptance criteria?

9.7 What automated test tools are being used? What are the test results?

9.8 Are any bug reporting tools being used, e.g. Test Director?

9.9 What continuous build testing is being carried out? What are the test results so far?

9.10 Has a load test plan been prepared for relevant high-load applications? Any test results so far?
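A load test plan ultimately boils down to firing concurrent requests and measuring the latency distribution against the concurrency requirements gathered in 8.8. A sketch against a stand-in function (`call_system` is a placeholder, not a real endpoint):

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def call_system() -> None:
    """Placeholder for one request to the system under test."""
    time.sleep(0.01)  # simulate a 10 ms response

def load_test(concurrency: int, total_calls: int) -> dict:
    def timed_call(_):
        start = time.perf_counter()
        call_system()
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(timed_call, range(total_calls)))
    return {
        "median_s": statistics.median(latencies),
        "p95_s": latencies[int(0.95 * (len(latencies) - 1))],
        "max_s": latencies[-1],
    }

results = load_test(concurrency=20, total_calls=200)
print({k: round(v, 4) for k, v in results.items()})
```

Whatever tooling the programme actually uses, its results should report a distribution (median /p95 /max), not just an average, since averages hide the tail latency that users notice under load.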

9.11 Assess all UAT plans, and what scripts will be used for UAT. Try to make sure the scripts are the ones used for final in-house system testing.

9.12 Has a suitable test team been recruited? Can they be used to test all programme deliverables from an end-to-end perspective?

NOTES

RICHARD C. SALCEDO

