Post Implementation Performance Review
September 2001
ISBN 0 642 34449 3

© Commonwealth of Australia 2001

This work is copyright. Apart from any use as permitted under the Copyright Act 1968, no part may be reproduced by any process without prior written permission from the National Archives of Australia. Requests and inquiries concerning reproduction and rights should be directed to the Publications Manager, National Archives of Australia, PO Box 7425, Canberra Mail Centre, ACT 2610.

Privacy Statement: Collection of personal information from email. The National Archives of Australia will only record your email address if you send us a message. The address will be used only for the purpose for which you have provided it. We will not use your email address for any other purpose and will not disclose it without your consent.
CONTENTS

H.1  WHAT IS A POST-IMPLEMENTATION REVIEW?
H.2  BENEFITS OF EVALUATING PERFORMANCE OF THE SYSTEM AND THE DEVELOPMENTAL PROCESS
H.3  RESOURCES AND PREREQUISITES
H.4  CONDUCTING A POST-IMPLEMENTATION REVIEW
     H.4.1  Plan the evaluation
     H.4.2  Collect and analyse performance data
     H.4.3  Document and report on your findings
     H.4.4  Take corrective action
     H.4.5  Make arrangements for ongoing review
H.5  CHECKLIST
H.6  WHAT'S NEXT?
ENDNOTE
H.1 WHAT IS A POST-IMPLEMENTATION REVIEW?

The purpose of Step H is to measure the effectiveness of recordkeeping practices, focusing on the solutions implemented, against the recordkeeping requirements and organisational constraints defined in Steps C to E. You should also evaluate the efficiency and appropriateness of the development process (especially Steps F and G), recommend and undertake any remedial action to address deficiencies, and establish a monitoring regime for the duration of the system or related components.

Step H involves interviewing management, staff and other stakeholders; conducting surveys; examining documentation developed during the earlier phases of the project; and observing and randomly checking operations.

To complete Step H you need to:
- plan the evaluation, including assessment criteria (Section H.4.1);
- collect and analyse performance data (Section H.4.2);
- document and report on your findings to management (Section H.4.3);
- take corrective action (Section H.4.4); and
- make arrangements for ongoing review (Section H.4.5).
H.2 BENEFITS OF EVALUATING PERFORMANCE OF THE SYSTEM AND THE DEVELOPMENTAL PROCESS

To date, your project has involved a considerable investment of resources: time, money, staff and goodwill. It is important to demonstrate to management and other stakeholders with a vested interest in organisational accountability that the developmental process has been conducted efficiently, and that the new or improved recordkeeping systems and practices have the capacity to deliver stated benefits in the short and long term. A post-implementation review can provide such assurance.

By completing the initial post-implementation review and conducting periodic checks you will:
- help guarantee a continuing return on the organisation's investment by maintaining the recordkeeping system at optimal levels of performance throughout its life cycle;
- have objective proof that your organisation is creating and managing appropriate evidence of its business activities in accordance with operational, accountability and community expectations;
- minimise your organisation's exposure to risk through system failure; and
- over time, anticipate significant changes in recordkeeping requirements and organisational needs that necessitate a new developmental cycle.
The scope of your post-implementation review will depend on the nature and extent of the solutions implemented. Outcomes of Step H include:
- a methodology to objectively assess your recordkeeping system(s);
- documentation regarding the performance of the system and the developmental process that can be used for comparative purposes in the future; and
- a report to management documenting your findings and recommendations.
H.3 RESOURCES AND PREREQUISITES

In order to undertake the post-implementation review you will need:
- a senior project manager;
- team members with analytical, interpersonal and communication skills, preferably familiar with recordkeeping and audit principles and processes and the IT sector;
- predefined performance benchmarks; and
- access to the records system (subject to security clearances) and supporting documentation.
In order to avoid any actual or perceived bias, the initial review should preferably be managed by personnel independent of the design and implementation process. This may require the appointment of a senior officer from elsewhere in the organisation or the use of external consultants (depending on the availability of resources, extent of in-house expertise, scope of the review and level of risk).

Other members of the review team will need to understand the project goals, the system design and implementation documentation, and the performance data collected through the monitoring process, so that they are able to assess whether the system is adequately meeting the organisation's recordkeeping requirements and organisational needs. The team should include records management and IT specialists.

Most of the design and implementation process should be completed before beginning a post-implementation review. However, some aspects of Step H can start concurrently with Step G.

H.4 CONDUCTING A POST-IMPLEMENTATION REVIEW

The specific circumstances, needs and culture of your organisation and the nature of the project will dictate the character and timing of the review. For example, a large organisation with multiple layers of management and complex business activities that implemented a new, organisation-wide electronic records system may choose to engage external consultants to prepare an in-depth report with an executive summary. A small organisation with few unique or high-risk functions that redeveloped policies and procedures, implemented a thesaurus and improved staff training may opt for peer review and an oral presentation to management.

An initial review should be carried out between six and twelve months after implementation, and then repeated on an agreed cycle.
H.4.1 Plan the evaluation
There are various types of evaluation, depending on the particular questions you want answered. Generally, they fall into the following areas:

Appropriateness
- appropriateness of the solution compared with the organisation's needs;
- objectives compared with available resources; and
- comparison of the need now with the original need.
Effectiveness
- original objectives compared with outcomes (what was desired and what was achieved);
- outcomes compared with needs;
- outcomes compared with standards;
- present outcomes compared with past outcomes; and
- comparison between target groups within the organisation.
Efficiency
- current costs compared with past costs (ie people, processes, technology and tools);
- costs compared with similar systems elsewhere (ie benchmarking); and
- extent of implementation compared with targets.
Put simply, an evaluation assesses 'did we do the right things?' and 'did we do things right?' It involves setting performance indicators to measure project outcomes (eg comparison of inputs to outputs, behavioural change, cost savings, level of satisfaction or involvement) and the project process (eg timeliness, teamwork, budget, satisfaction of sponsors and other stakeholders).

Ideally, an evaluation framework should be developed as part of the original design process so that performance data requirements can be built into management and administrative processes, and the review can be conducted with minimal intrusion on work practices or the delivery of services.

At this point in the evaluation phase you should consider the scope of the review, relevant performance indicators and acceptable tolerance levels, the duration of the review and the form of reporting required. Remember that you should evaluate not only any new or redesigned technological components, but recordkeeping practices as a whole: the people, processes and other tools such as the corporate thesaurus and records disposal authority.

Depending on the complexity of the project and the level of resources available, it may be appropriate to conduct risk and feasibility assessments to determine the scope and emphasis of the post-implementation review. See Appendix 11, 'Risk analysis in DIRKS', and Appendix 12, 'Recordkeeping feasibility analysis', for assistance in identifying and evaluating risk and feasibility factors.
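The idea of pairing each performance indicator with a target and an acceptable tolerance level can be sketched in code. The following is a minimal illustration only; the indicator names, targets and measured figures are invented for the example and are not drawn from the DIRKS methodology:

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One performance indicator with a target and acceptable tolerance."""
    name: str
    target: float     # desired value agreed during planning
    tolerance: float  # acceptable deviation from the target
    measured: float   # value observed during the review

    def within_tolerance(self) -> bool:
        # The indicator passes if the measured value deviates from the
        # target by no more than the agreed tolerance.
        return abs(self.measured - self.target) <= self.tolerance

# Hypothetical indicators for a post-implementation review
indicators = [
    Indicator("records captured within 24 hours (%)", 95.0, 5.0, 91.5),
    Indicator("retrieval success rate (%)", 98.0, 2.0, 93.0),
]

for ind in indicators:
    status = "ok" if ind.within_tolerance() else "investigate"
    print(f"{ind.name}: {ind.measured} "
          f"(target {ind.target} +/- {ind.tolerance}) -> {status}")
```

Recording both the target and the tolerance at planning time, as the text recommends, makes the later review a mechanical comparison rather than a judgment call.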
H.4.2 Collect and analyse performance data
Key tools for assessing performance are the benchmarks developed for Step D: your identified recordkeeping requirements from Step C and the required recordkeeping functionality. These documents should set out your organisation's recordkeeping requirements (ie the need to create, capture, maintain and dispose of records) and organisational constraints (ie cultural, technical, economic, sociopolitical and other factors). Indeed, any reports arising from earlier steps that include recommendations on improvements to existing systems will help inform the review process (eg Step D).

Such documentation will help establish performance indicators and frame questions on a range of issues such as the following:

Records creation and retrieval
- Are records being created to adequately document the organisation's business activity?
- Are records being captured into appropriate systems in a timely, accurate and complete manner?
- Are records adequately preserved and accessible?
- Can users retrieve the records they need?
- Can users easily locate the records they need using the new system?
- Do the organisation's business classification scheme and thesaurus adequately reflect its functions, activities and transactions?
Management
- Is systems documentation adequate for operational and maintenance purposes?
- Have audit trails been established between the old and new systems?
- Have vital records and relevant control information from the old system been converted to the new system?
- Are staff still relying on unauthorised stand-alone or ad hoc systems in preference to the new system(s)?
- Are adequate security arrangements in place to protect sensitive records (taking into account privacy and confidentiality)?
- Is the physical security of records adequate (eg locked cabinets, compactus)?
- Have the quality and level of records-based services changed since implementation, from both the user and management perspective (eg ease of use, precision and coverage of search and retrieval)?
- How often are maintenance checks carried out?
- Has the disaster response plan been tested?
Disposal
- Does the organisation have comprehensive records disposal coverage? Does the coverage satisfy the needs of the organisation? Is it consistent with the appraisal requirements of the National Archives of Australia?
- Are the retention periods sufficient to fulfil business, accountability and societal requirements, or do they need to be revised to longer or shorter periods?
- Are records being sentenced and disposed of appropriately? Is there documentation to substantiate these actions (eg certificates of destruction, retention of control records, current disposal authorities)?
- Is the organisation creating any additional types of records that need to be appraised and incorporated in its disposal coverage?
- Are some of the vital records listed no longer being created?
Remember that any changes to disposal authorities need to be endorsed by the National Archives of Australia.

Staff training
- Are personnel aware of their recordkeeping roles and responsibilities (eg through job descriptions, training)?
- Do personnel have access to up-to-date policy and procedural documentation?
- What problems have personnel experienced with the new system?
- Are personnel applying classification and titling conventions appropriately and consistently?
- Are personnel retrieving records using the appropriate tools?
- Are staffing levels and competency skills adequate (within the records management area and among other staff)?
Learning from the process
- Were the original terms of reference sufficiently precise to guide the project?
- Did we negotiate sufficient resources to carry out the project?
- Did we keep the key stakeholders informed and committed during the project?
- Could we improve our broad planning processes if we had to start afresh?
- Was the planning process appropriate to the project's size, complexity and predictability?
- Did our techniques for managing the project work well?
- Are the stakeholders satisfied?
- Did we complete the project on time? Did we complete the project within budget?
- Have we handed the project over as a going concern? Do we need a formal sign-off?
- Have we acknowledged everyone who made a contribution?
- Have we celebrated our success as a team (and as an organisation)?
Organisations may also wish to consult the final phase of the 'Model implementation plan' and the 'Check list for performance testing of records management systems' that appear in Australian Standard AS 4390–1996, Records Management, to identify other potential aspects for evaluation. [1]

Performance data can be gathered by:
- interviewing stakeholders (eg project sponsor, senior management, business experts, records management staff and representative users);
- using questionnaires or surveys;
- observing the system in operation;
- examining procedures manuals, training materials and other documentation;
- carrying out random checks on the quality of records and control information; and
- obtaining computer-generated reports on usage figures for statistical analysis.
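As a small illustration of statistical analysis of computer-generated usage figures, the snippet below derives a retrieval success rate and average retrieval time from a usage log. The log format, column names and data are all hypothetical; a real records system would export its own report format:

```python
import csv
import io

# Hypothetical usage log exported from the records system:
# each row records one retrieval attempt by a user.
usage_csv = """user,found_record,seconds_taken
alice,yes,40
bob,no,310
carol,yes,55
dave,yes,120
"""

rows = list(csv.DictReader(io.StringIO(usage_csv)))

# Proportion of attempts in which the user located the record sought.
success = sum(1 for r in rows if r["found_record"] == "yes")
rate = 100.0 * success / len(rows)

# Mean time taken per retrieval attempt.
avg_time = sum(int(r["seconds_taken"]) for r in rows) / len(rows)

print(f"retrieval success: {rate:.1f}% over {len(rows)} attempts")
print(f"average retrieval time: {avg_time:.0f} s")
```

Figures like these, collected on the same basis at each periodic review, give the comparative baseline the document recommends retaining for future reference.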
The performance data gathered on these issues will help you to document any variations or deviations from the requirements defined in Steps C and D, review the recordkeeping strategies chosen in Step E, identify areas that warrant priority, recommend corrective action, and propose mechanisms for ongoing monitoring.

H.4.3 Document and report on your findings

The review process and findings should be presented to management, and a record created and retained for evidential and future reference purposes. This record may take the form of a written report, speaking notes or minutes. Keep in mind that circumstances may change over time and the justification for decisions made or action taken may become very important. For this reason the review and any follow-up action should be formally endorsed by senior management, and all project management files and systems documentation should be brought up to date before the project team is dispersed.
H.4.4 Take corrective action
Any remedial action found necessary as a consequence of the post-implementation review should be undertaken, and the action documented. This may involve revisiting some of the steps in the DIRKS methodology.
H.4.5 Make arrangements for ongoing review
It is important for organisations to adopt evaluation strategies that can be used to regularly review their systems over time. All components of the system should be periodically examined to identify changes to recordkeeping requirements, respond to environmental changes (such as user requirements), assess the efficiency of technological components and anticipate the need for any modifications or systems redevelopment.

H.5 CHECKLIST

Before concluding the systems development process, ensure that you have:
- identified and documented performance indicators for the system and the process;
- identified methods to collect performance data for the initial and subsequent reviews;
- gathered performance data;
- assessed whether the system is satisfying recordkeeping requirements and working within organisational constraints;
- produced substantiating documentation;
- assessed the design and implementation process and documented costs and benefits;
- reviewed the conversion strategy, process and audit trail;
- verified the existence, currency and comprehensiveness of technical, user and training documentation (including policies, procedures and records disposal authorities);
- prepared a report for management on the status of the system and recommendations for improvement (including staffing issues);
- initiated remedial action endorsed by management; and
- established an ongoing cycle of reviews of recordkeeping.

H.6 WHAT'S NEXT?

Most systems require significant redevelopment after four to ten years of operation to accommodate changes in organisational practices, business needs and technological infrastructure. At this point organisations should review Steps A to D of the DIRKS methodology.
ENDNOTE

1. See Australian Standard AS 4390–1996, Records Management, Part 3: Strategies, Appendix A, Clause A2.6, and Appendix C.