
MANAGEMENT PAPER

On Target: The Practice of Performance Indicators

The Audit Commission promotes the best use of public money by ensuring the proper stewardship of public finances and by helping those responsible for public services to achieve economy, efficiency and effectiveness.

The Commission was established in 1983 to appoint and regulate the external auditors of local authorities in England and Wales. In 1990 its role was extended to include the NHS. Today its remit covers more than 13,000 bodies which between them spend nearly £100 billion of public money annually. The Commission operates independently and derives most of its income from the fees charged to audited bodies.

Auditors are appointed from District Audit and private accountancy firms to monitor public expenditure. Auditors were first appointed in the 1840s to inspect the accounts of authorities administering the Poor Law. Audits ensured that safeguards were in place against fraud and corruption and that local rates were being used for the purposes intended. These founding principles remain as relevant today as they were 150 years ago.

Public funds need to be used wisely as well as in accordance with the law, so today’s auditors have to assess expenditure not just for probity and regularity, but also for value for money. The Commission’s value-for-money studies examine public services objectively, often from the users’ perspective. Its findings and recommendations are communicated through a wide range of publications and events.

For more information on the work of the Commission, please contact:

Andrew Foster, Controller, The Audit Commission,
1 Vincent Square, London SW1P 2PN, Tel: 020 7828 1212
Website: www.audit-commission.gov.uk
Contents

1 Introduction
2 The different users and uses of performance indicators
3 Developing effective performance indicators
  Focus on the right topics
  Focus on the right measures
  The ‘Three Es’
  The balanced scorecard
  Best value performance indicators
  NHS Performance Assessment Framework
  The NHS Wales Performance Management Framework
4 Types of indicators
  Creating outcome measures
  The quality dimension
  Cross-cutting indicators
  Getting the balance right
5 Criteria for robust performance indicators
  Relevant
  Clear definition
  Easy to understand and use
  Comparable
  Verifiable
  Cost effective
  Unambiguous
  Attributable
  Responsive
  Avoid perverse incentives
  Allow innovation
  Statistically valid
  Timely
  Assessing the importance of the criteria
6 Making indicators work
  Controlling the number of indicators
  Develop ‘action-focused’ indicators
  Report only indicators that are relevant to the user
  Use composite or illustrative indicators
  Regular refinement of indicators
  Setting targets
  Avoid common pitfalls
7 Conclusion: what to do next?
Bibliography
MANAGEMENT PAPER • ON TARGET

© Audit Commission 2000

First published in June 2000 by the Audit Commission for Local Authorities and the National Health Service in England and Wales, 1 Vincent Square, London SW1P 2PN.

Typeset by Ministry of Design, Bath.

Printed in the UK for the Audit Commission by Kent Litho.

Illustrations by Patrick MacAllister.

ISBN 1 86240 228 0

1. Introduction

1. The introduction of best value in local government, the Performance Assessment Framework (PAF) in the NHS and in social services, the NHS Wales Performance Management Framework (PMF), and Public Service Agreements (PSAs) in central government have emphasised the importance of having good performance indicators (PIs) as part of performance management in the public sector. To complement the national sets of PIs, public sector organisations are expected to set their own PIs and targets, and to monitor and publish their performance.

2. The Commission has extensive experience of developing and using performance indicators. This paper on The Practice of Performance Indicators sets out the lessons that have been learnt. It is aimed at helping managers and practitioners in local government, the NHS and central government to develop their own set of balanced and focused indicators. It describes ways to ensure that PIs are robust and well framed.

3. Indicators should be used within a wider framework of performance measurement systems, performance management and overall strategic management of services. This is described more fully in the Audit Commission paper Aiming to Improve: The Principles of Performance Measurement. These principles are listed in Box A.

BOX A
The six principles of a performance measurement system
• Clarity of purpose. It is important to understand who will use the information, and how and why the information will be used. Stakeholders with an interest in, or need for, performance information should be identified, and indicators devised which help them make better decisions or answer their questions.
• Focus. Performance information should be focused in the first instance on the priorities of the organisation – its core objectives and service areas in need of improvement. This should be complemented by information on day-to-day operations. Organisations should learn how indicators affect behaviour, and build this knowledge into the choice and development of their performance indicators.
• Alignment. The performance measurement system should be aligned with the objective-setting and performance review processes of the organisation. There should be links between the performance indicators used by managers for operational purposes, and the indicators used to monitor corporate performance. Managers and staff should understand and accept the validity of corporate or national targets.
• Balance. The overall set of indicators should give a balanced picture of the organisation’s performance, reflecting the main aspects, including outcomes and the user perspective. The set should also reflect a balance between the cost of collecting the indicator, and the value of the information provided.
• Regular refinement. The performance indicators should be kept up to date to meet changing circumstances. A balance should be struck between having consistent information to monitor changes in performance over time, taking advantage of new or improved data, and reflecting current priorities.
• Robust performance indicators. The indicators used should be sufficiently robust and intelligible for their intended use. Independent scrutiny, whether internal or external, helps to ensure that the systems for producing the information are sound. Careful, detailed definition is essential; where possible, the data required should be needed for day-to-day management of the services.
Source: Audit Commission
2. The different users and uses of performance indicators

4. A performance measurement system can have a wide range of users, who may use the information in different ways. These different requirements need to be recognised when devising performance indicators. Users of performance information include:
• service users: direct (visitors at the library, passport applicants or patients) and indirect (relatives and parents);
• the general public, including interest groups and the media;
• central government;
• politicians (local and central), local councillors and non-executive directors of trusts and health authorities;
• auditors and inspectors;
• managers at all levels in the organisation; and
• staff.

5. Performance information is not an end in itself. It may be used to:
• Measure progress towards achieving corporate objectives and targets. Organisations can use PIs to monitor the achievement of corporate objectives. PIs should be set at an overarching level to monitor achievement of the strategic objectives, but should also address operational objectives, assessing the performance of day-to-day activities.
• Promote the accountability of service providers to the public and other stakeholders. The public and politicians may use PIs to hold organisations accountable. The publication of an organisation’s PIs can increase the awareness of consumers and citizens of the level of services they receive, and whether the organisation is living up to its commitments.
• Compare performance to identify opportunities for improvement. Performance indicators may also be used to identify opportunities for improvement through comparison both within the organisation over time or between different units or organisations. In recent years, there has been an increase in benchmarking activity in all parts of the public sector. Benchmarking aims to share understanding of how organisations perform, identify which processes explain differences in performance, and where and how improvements can be made. PIs can be used in this process to identify good performance and problem areas, and to measure the effect of action.
• Promote service improvement by publicising performance levels. PIs can be employed to encourage service improvement by using nationally published comparative information to identify whether organisations are performing well or poorly.
These uses can be seen as forming a continuous process from operational management to national performance indicators [EXHIBIT 1].

6. Nationally prescribed indicators are only a small proportion of all the performance indicators that might be used. Underpinning the national indicators, which reflect national priorities and are intended for national publication, are local performance indicators. These will reflect local objectives, and will be supported by indicators that provide managers with information that allows them to run a particular service effectively. Operational indicators can also be related to personal performance targets for staff and managers, making the link from the overall objectives all the way to the individual level.

EXHIBIT 1
The different users and uses of indicators
Indicators should form a coherent set, with operational indicators supporting the publication of local and national indicators.
[Diagram summarised:
National indicators – users: public and stakeholders; government, politicians and senior managers – uses: national publication, monitoring key priorities, accountability.
Local indicators – users: public and stakeholders; local politicians; senior managers – uses: setting and meeting local objectives, accountability.
Management information indicators – users: managers and staff – use: day-to-day management.]
Source: Audit Commission

7. But all users of indicators should remember that the indicators do not provide answers to why differences exist but raise questions and suggest where problems may exist (acting as a ‘can-opener’). A PI is not an end in itself. It is essential that users and producers of PIs share the same expectations of what a performance indicator can be employed for, to avoid the misuse of an indicator.
3. Developing effective performance indicators

8. The saying ‘What gets measured gets done’ illustrates the importance of the right things being measured and inappropriate things being left out. If an organisation does not measure what it values, it will end up valuing what can be measured. The choice of indicators will have a major impact on the operation and direction of the organisation, and knowledge of the factors driving behaviour and influencing performance thus becomes crucial.

9. An organisation tackling the task of developing a suite of performance indicators needs to address two questions:
• what topics should the indicators focus on?; and
• what aspects should be measured?

Focus on the right topics

10. Performance indicators should in the first instance focus on aspects of the service that are important for the organisation. This means that the organisation should be clear about what it is seeking to achieve – its core objectives – and about how it will achieve them. It should also be clear how it will know whether it is achieving its objectives.

11. Performance indicators should focus on the actions and services provided at each level in the organisation to achieve its objectives. High-level indicators will address corporate issues: lower-level indicators will look at operational and day-to-day matters. But organisations should be careful to avoid the common pitfall of measuring that which is easily measured, rather than that which should be measured.

Focus on the right measures

12. It is important to develop a balanced set of indicators that reflect all the aspects of the service. While it is common to look initially at the information that is currently available, this should be reviewed to identify any important gaps. There are several different frameworks that can be used to do this.

EXHIBIT 2
Different aspects of performance
Economy, efficiency and effectiveness link inputs to outcomes.
[Diagram summarised: inputs are turned into outputs, which lead to outcomes; economy relates to acquiring inputs, efficiency to converting inputs into outputs, effectiveness to achieving outcomes, and cost effectiveness links inputs directly to outcomes.]
Source: Audit Commission
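The relationships in EXHIBIT 2 can be made concrete with a small worked example. The sketch below is purely illustrative – the library figures and variable names are invented for this note, not taken from the paper – but it shows how economy, efficiency and a crude effectiveness rate can each be expressed as a ratio over cost, input, output and outcome data.

```python
# Hypothetical illustration of the 'Three Es' as simple ratios.
# All figures and names are invented for the example.

book_budget_spent = 50_000.0   # cost: money spent acquiring resources (new books)
books_bought = 4_000           # input: resources employed to provide the service
library_visits = 25_000        # output: service provided to the public
satisfied_users = 20_000       # outcome proxy: visitors who found what they wanted

# Economy: acquiring resources of appropriate quality at the lowest cost.
cost_per_book = book_budget_spent / books_bought

# Efficiency: the cost of inputs relative to the outputs produced.
cost_per_visit = book_budget_spent / library_visits

# Effectiveness (proxy): the extent to which the service achieves its aims.
satisfaction_rate = satisfied_users / library_visits

print(f"Economy:       £{cost_per_book:.2f} per book")
print(f"Efficiency:    £{cost_per_visit:.2f} per visit")
print(f"Effectiveness: {satisfaction_rate:.0%} of visitors satisfied")
```

As paragraph 24 later notes, the first two ratios fall out of routine cost and resource data; only the effectiveness figure requires going back to the service’s objectives for a suitable outcome measure.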
The ‘Three Es’

13. One common way of developing performance indicators is to use the three dimensions of economy, efficiency and effectiveness [EXHIBIT 2]. These have been central to the Audit Commission’s work.

14. The basic measures when constructing the three Es are:
• cost: the money spent to acquire the resources;
• input: the resources (staff, materials and premises) employed to provide the service;
• output: the service provided to the public, for example, in terms of tasks completed; and
• outcome: the actual impact and value of the service delivery.

15. The three dimensions of performance are defined as follows:
Economy: ‘acquiring human and material resources of the appropriate quality and quantity at the lowest cost’ (staff, materials, premises). An example is the cost of buying new books for a library.
Efficiency: ‘producing the maximum output for any given set of resource inputs or using the minimum inputs for the required quantity and quality of service provided’. An example is the cost per visit to public libraries.
Effectiveness: ‘having the organisation meet the citizens’ requirements and having a programme or activity achieve its established goals or intended aims’. An example is ‘the percentage of library users who found the book/information they wanted, or reserved it, and were satisfied with the outcome’. Effectiveness is about assessing whether the service is actually achieving what it set out to do.

16. Some organisations add a fourth E, equality or equity. This dimension captures the degree to which access to services is equitable, and whether the services are appropriate to the needs of all those who should be able to use them. This has been incorporated in several measurement frameworks (for example, both the NHS Performance Assessment Framework and the set of best value performance indicators have the category fair access).

17. A related approach to ensure a balanced representation of the service is the differentiation between quality, cost and time [BOX B]. Cost reflects the financial side of the organisation’s activities, quality captures the features of a service and their appropriateness for the user, and the time aspect covers the responsiveness and speed with which services are delivered. Not all stakeholders will find these dimensions equally relevant, but together cost, quality and time can provide a simple way to ensure balanced coverage in relation to a service area. See Measuring Up, CIPFA, 1998, for further information.

BOX B
A rounded set of indicators
A balanced set of indicators can be constructed by using cost, time and quality measures.
Example: Average speed of processing a new housing benefit claim (timeliness)
Average cost of a claim (cost)
Percentage of cases where the calculation of benefit was correct (service quality)
Source: Adapted from Measuring Up, CIPFA, 1998
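BOX B’s rounded set can be computed mechanically once the underlying claim records hold all three dimensions. The sketch below is a hypothetical illustration – the records, figures and field names are invented, not drawn from the paper:

```python
# Hypothetical sketch of BOX B's cost/time/quality set for housing
# benefit claims. All data and field names are invented.

claims = [
    # (days_to_process, cost_of_processing, calculation_correct)
    (10, 42.0, True),
    (25, 55.0, True),
    (14, 48.0, False),
    (31, 60.0, True),
]

avg_speed = sum(days for days, _, _ in claims) / len(claims)      # timeliness
avg_cost = sum(cost for _, cost, _ in claims) / len(claims)       # cost
pct_correct = 100 * sum(ok for _, _, ok in claims) / len(claims)  # service quality

print(f"Average speed of processing a new claim: {avg_speed:.1f} days")
print(f"Average cost of a claim: £{avg_cost:.2f}")
print(f"Benefit calculated correctly: {pct_correct:.0f}% of cases")
```

Deriving all three figures from the same records keeps the set balanced: an office that speeds up processing at the expense of accuracy would show up immediately in the quality figure.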
The balanced scorecard

18. Another approach, originally developed by the private sector, suggests that four perspectives – a ‘balanced scorecard’ – are needed in an indicator set to provide a comprehensive view of the performance of an organisation:
• Service user perspective: how does the organisation meet customers’ needs and expectations?
• Internal management perspective: the identification and monitoring of the key processes by which good quality and effective services are provided.
• Continuous improvement perspective: securing continuous learning and improvement processes in systems and people, and thus services.
• Financial (taxpayer’s) perspective: how resources are used in an economic and efficient way to achieve the objectives of the organisation.
Besides making sure that the PIs cover these perspectives, it is often helpful to consider whether indicators addressing the following aspects of performance might be appropriate:
• long- and short-term objectives;
• qualitative as well as quantitative performance, for example using checklists to assess how a service is delivered as well as the cost and volume of the service;
• strategic and operational issues;
• process and outcome; and
• the views of the main stakeholder groups.
More information on the balanced scorecard approach can be found in The Measures of Success: Developing a Balanced Scorecard to Measure Performance, Accounts Commission, 1998.

Best value performance indicators

19. Another way to structure the measurements of the different aspects of performance has been used by Government in defining the best value performance indicators:
• strategic objectives: why the service exists and what it seeks to achieve;
• cost/efficiency: the resources committed to a service, and the efficiency with which they are turned into outputs;
• service delivery outcomes: how well the service is being operated in order to achieve the strategic objectives;
• quality: the quality of the service delivered, explicitly reflecting users’ experience of services; and
• fair access: the ease and equality of access to services.

NHS Performance Assessment Framework

20. The NHS Performance Assessment Framework is founded on six areas for which performance indicators have been developed:
• health improvement;
• fair access;
• effective delivery of appropriate healthcare;
• efficiency;
• patient/carer experience; and
• health outcomes of NHS care.

The NHS Wales Performance Management Framework

21. The NHS Wales Performance Management Framework lists the following areas:
• fairness;
• effectiveness;
• efficiency;
• responsiveness;
• integration;
• accountability;
• flexibility; and
• promoting independence.
4. Types of indicators

Creating outcome measures

22. Outcome measures are crucial to monitoring the achievement of service objectives. Having determined the objectives, it can, however, be difficult to devise suitable outcome measures, especially in complex areas such as social services and the health service.

23. The ‘ripple effect’ approach can help to identify links between objectives and outcome measures for a service [EXHIBIT 3]. Outcome measures can be quite difficult to construct, as outcomes may be long term and influenced by various factors. A service could be viewed as a stone tossed into water, creating several ripples of effect around it. The outer ripples reflect the overarching objectives of the service, and the inner ripples the more measurable aspects of the service. This process can be used in two ways. First, it can be used to develop indicators for the objectives of a service area (moving from the outer to the inner circles towards a more and more measurable level). Secondly, the method is also helpful in situations where an outcome measure exists but the links to the overall strategic objectives of a service need to be established (moving from the inner to the outer circles).

EXHIBIT 3
The ‘ripple effect’ approach
A simplified example of how the ripple effect links objectives with outcome measures: the outer circle contains the less measurable outcome elements of the service (people able to have a reasonable standard of living), the inner circle the more measurable indicators and targets (benefits paid accurately and on time).
[Diagram of concentric ripples, innermost to outermost: ‘Benefits paid accurately and on time’; ‘People receive the benefits they are entitled to’; ‘People able to meet their basic financial needs’; ‘People have a reasonable standard of living’.]
Source: A Measure of Success, Audit Commission, 1999

24. Outcome measures – measures of effectiveness – depend on a clear understanding of what the service is seeking to achieve. Economy and efficiency indicators can usually be constructed quite simply by looking at costs and at resource deployment. But to see whether the service is effective means going back to the original problem, and asking ‘has the problem been resolved, or the benefit been delivered?’ Answering the question may be difficult and require surveys or detailed one-off investigations.

25. Outcomes of services may also take a long time to emerge, such as the impact of services on quality of life. The full benefits of pre-school education may not be apparent until much later in life. It may be possible to monitor the service only by looking at process measures,
such as the proportion of children receiving pre-school education, as these can be measured in a more timely way. Process indicators thus measure the activities that will lead to the final results. To use a ‘substitute’ of this kind depends on evidence that process and outcome are linked or at least on a general consensus that the process contributes to the outcome. Research may be needed to avoid process measures that lead the organisation to the wrong outcomes.

26. One of the most recent developments in relation to outcome measures is the search for cost-effectiveness indicators, where links are sought between resources and effectiveness or outcome. The design of such measures is still in its infancy, and will depend on the existence of good outcome measures.

The quality dimension

27. The quality of public services, and the need to improve it, is a matter of concern to all involved. Performance measurement systems therefore need to address the question of quality.

28. Various definitions of quality exist. Some of these focus on the attributes of a service, others on whether the service meets the expectations of the user, and a third group on a mixture of the two.

29. The Audit Commission developed a concept of quality in Putting Quality on the Map, 1993, to reflect the fact that, in the public sector, financial constraints and the subsidy given to many services may mean that it is not always possible to satisfy users’ needs. Additionally, it is important to bear in mind that quality is multidimensional. Providing a quality service means balancing the different factors of service standards, delivery and costs. The quality map [EXHIBIT 4] incorporates the following elements:

EXHIBIT 4
The quality map
The Commission emphasises the importance of communication as well as consistent delivery of appropriate standards, and economic use of resources.
[Diagram summarised: the service user sits between the quality of communication (communicate/understand) and the quality delivered by the provider, which rests on the quality of delivery, the quality of specification, and the quality of people & systems, all underpinned by effective use of resources.]
Source: Putting Quality on the Map, Audit Commission, 1993
• Communication: the ‘map’ emphasises the importance of understanding users’ priorities and demands, and managing expectations through communicating with service users, advising them of the standards that can be provided within the resources available, how services will be allocated or prioritised and incorporating this knowledge in the standards of service specified. The importance of understanding stakeholders’ priorities is further illustrated in Box C.
• Specification: questions the standards of services that are set. These should be based on balancing users’ needs and wishes with the organisation’s resource constraints.
• Delivery: are the standards delivered reliably?
• People & systems: are staff trained to deliver services effectively, and are they supported by good systems?
• Effective use of resources: does the service make good use of resources?

BOX C
The importance of understanding stakeholders’ priorities
A metropolitan council sought to improve its housing service by renovating its housing offices, improving the reception areas with carpeting and plants. It also provided facilities such as vending machines, toys for children and toilets. But a survey showed that these improvements had had little effect on users’ satisfaction with the service. Further research showed that the aspects of the service that had been improved were low on users’ priorities. What users most wanted was to be able to see the same officer each time they visited the housing offices. This issue had not been addressed by the council.
Source: Audit Commission

30. A number of methods can be employed to gauge the quality of a service:
• quantitative indicators;
• consumer surveys;
• number of complaints;
• proxy measures of quality;
• qualitative (yes/no) indicators, for example, checklists of ‘good practice’ processes; and
• professional assessment.

31. Quantitative indicators can in some cases be devised to address the quality aspect of a service. An example is the ‘percentage of 999 calls answered within local target response time’. If paramedics reach a heart attack victim quickly it can save a life, and the police arriving at a crime scene quickly can be important in reassuring victims and following up incidents. Quantitative indicators can also address reliability or accuracy issues: for example, ‘the percentage of trains running on time’ and ‘the percentage of trains cancelled’ measure the reliability of the service.

32. When basing a PI on consumer surveys, it is important to have a clear understanding of the target group(s) and whether they are reached by the survey. The type of survey should also be considered: will it probe the users’ experience of the service, or will it focus only on their satisfaction with the service? For example, a survey on library services may miss essential information if it focuses only on the users and misses the non-users and the reasons why they do not visit libraries. Some guidance on how to carry out user satisfaction surveys is provided in Guidance on Methods of Data Collection published by the Department of the Environment, Transport and the Regions, the Housing Corporation and the National Housing Federation in 2000, with the support of the Local Government Association, Improvement and Development
practice-of-perf-indicators.qxd 2/6/00 3:40 pm Page 14

M A N A G E M E N T P A P E R • O N T A R G E T

Agency and the Department of 35. Another tool for addressing there is a public benefit beyond the
Social Security. quality is the use of qualitative individual. It has been argued that
(‘yes/no’) indicators. A qualitative in these cases ‘objective measures’
33. The number of complaints can
indicator can be developed using a of the ‘right’ level of quality should
be used as an indicator of quality.
series of ‘yes/no’ questions to be established based on
An increasing number of complaints
describe quality. For example, the expert/professional opinion. But
about a service can be a valuable
way in which a planning application understanding services from the
indicator that problems are
is handled can have an impact on users’ perspective and developing
developing. However, such an
applicants’ perception of service user-focused indicators should never
indicator may be unsuitable for
quality. ‘Do you provide pre- be overlooked, and professional
comparing an organisation with
application discussions with assessment should be used in
another due to differences in
potential applicants on request?’, conjunction with user-focused
expectations about service levels. A
‘Do you have a publicised charter indicators.
low level of complaints could
which sets targets for handling the
indicate that service users have
different stages of the development Cross-cutting indicators
given up due to bad experience
control process?’, and ‘Do you 38. Some services can be
…with the handling of complaints or that access to the complaints system is difficult.

34. Proxy measures are used when more direct measures are either not available or very difficult or expensive to collect. One example is visits to museums, which can act as an indirect measure of quality where comparisons have to be made over time. Another example can be in an area where consumers have a choice of two or more facilities, where the relative usage of each can be taken as an indication of the way that people have balanced the issues of the quality of the facility, its accessibility and cost. If cost and accessibility are similar, the differences in usage can form a proxy measure for the relative quality of each facility. It is, however, important to use such proxy measures with caution.

Checklist questions such as ‘…do you monitor your performance against these targets?’ cover processes that have been found to affect users’ perception of quality. By creating such checklists of ‘good practice’ processes, important quality issues can be captured.

36. Sometimes people may not have an explicit view of what they actually want from a service or what it is reasonable to expect. For example, though patients will be able to judge whether they feel better or not after a treatment, it may be difficult for them to determine whether the treatment could have been better. In such cases, it might be relevant to base quality standards on professional assessment and inspection.

37. In other cases, the expectations of individual citizens may differ from society’s expectations.

38. Performance in some service areas is significantly influenced by several agencies. Indicators that measure the collective performance of all the agencies are referred to as ‘cross-cutting’ indicators. A well-chosen cross-cutting indicator, which all parties accept as relevant, and which all can influence, can have a powerful effect on improving joint working and achieving better service delivery. In some cases, individual influence can be hard to determine, and all the organisations involved may need to accept ‘joint and several’ responsibility for improving performance. In other cases, it may be possible to allocate responsibility to individual organisations more precisely.

39. An example of an issue that involves several agencies is the need to reduce and prevent crime by young people. A number of service


providers need to work together to achieve results: schools, the police, housing, social services, probation etc. The success of joint working can be addressed by defining an indicator that all parties are able to agree as relevant: the level of re-offending.

40. Vulnerable or elderly patients often have continuing care needs after being discharged from acute hospital care in order to resume independent life at home. Early discharge avoids bed blocking but requires good discharge planning, with collaboration between the hospital, relatives, community health services and the local authority. An indicator of collaboration could be ‘hospital bed days lost due to delayed discharge’ – the higher the figure, the poorer the collaboration. An indicator of good discharge and appropriate rehabilitation could be ‘the percentage of patients admitted from their home who return to their own home’.

41. An example of a cross-cutting target within the NHS is: ‘to reduce deaths from coronary heart disease and stroke and related diseases in people under 75 by at least two-fifths by 2010’. From this high-level objective, PIs and targets have been set which require co-ordinated action by primary and secondary health care organisations, including hospitals and ambulance trusts.

Getting the balance right

42. A single indicator rarely provides enough information on its own to give a comprehensive picture of performance in a service area. Furthermore, it carries the risk of skewing performance if resources are shifted into activities simply because they are being measured. A single indicator, or a limited number of indicators, may also give a misleading picture. Organisations need to balance the number of indicators against the cost of collection, and the risks of skewing performance or misleading people. These risks can also be reduced by developing robust performance indicators.


5. Criteria for robust performance indicators


43. There are a number of general characteristics of indicators that can help to ensure that proposed indicators will be useful and effective. The following examples have been chosen to illustrate these characteristics. However, some of the examples showing good practice on the specific characteristic being discussed may have weaknesses when considered against other characteristics.

Relevant

44. Indicators should be relevant to the organisation. One way of helping to ensure relevance is to relate the performance indicators to the strategic goals and objectives of the organisation or of a specific service area. Such an approach will also limit the risk of setting up PIs because data are available rather than to meet a need in the organisation.

45. Indicators should ideally also be relevant to the people providing the data. The danger is that if an indicator is not seen as relevant, they will not bother to collect the information accurately. Relevance to the user of the PI is also important, but it may not be possible for a single indicator to be relevant to all users due to differences in perspectives and interests. One possibility is to conduct a stakeholder analysis and accordingly target the performance information on the various groups and their respective needs.

Example: The indicator, ‘The average waiting time for a particular operation in a given trust’, is relevant both to prospective patients and to hospital managers. In contrast, the indicator, ‘The total number of people on the waiting list for operations in a given trust at the start of each month’, is of less use to the prospective patient, who will worry more about the length of time that he or she will wait for an operation. The information can, however, constitute relevant knowledge for senior managers and the trust board members.

Clear definition

46. A PI should have a clear and intelligible definition in order to ensure consistent collection and fair comparison. Vague descriptions can lead to misinterpretation and confusion. Care should be taken to avoid making the definition so complex that people have difficulty in collecting the information. Definitions that are too tight or too broad may also create problems. Definitions that are too tight may make it difficult for some of the data providers to deliver the information, while definitions that are too broad could allow for a number of different ways of counting what is being measured.

47. Some of the data used to calculate a PI might already be defined and collected by other agencies. Using an existing definition can be helpful in ensuring consistency. Care should also be taken to avoid a definition that is close to, but different from, an existing definition, which could duplicate the effort involved in data collection and lead to confusion.

Example: A key indicator for NHS estate managers to monitor the management of water and sewerage services, and thereby contain costs, is ‘Water consumed per occupied bed day’. Water supplied to hospital sites is metered and therefore easily measured. Occupied bed days is a commonly used statistic in hospitals, collected through a well-defined and well-known method of counting.

The indicator defined as ‘Percentage of patients at outpatient clinics seen within 30 minutes of appointment time’ will need clarification if it is to be used for comparison between trusts: how do you treat patients who arrive late, since their lateness, often outside the influence of the trust, will make the performance of the organisation look less good? What


about clinics where patients go to have something else done (for example, have an ultrasound scan) before they see a doctor? If such a definition is chosen, because it is relevant to patients, it is important to specify whether latecomers are excluded or included in the data, and whether note should be taken of their other engagements at the trust.

Easy to understand and use

48. It is important that indicators are described in terms that the user of the information will understand, even if the definition itself has to use technical terminology. Indicators focused on the public should avoid management jargon or abstract concepts, such as using ‘full-time equivalent staff numbers’ in a comparison where ‘staff numbers’ would be more understandable.

Example: The indicator, ‘Percentage of teachers who are female’, is easy for the public to understand. The indicator, ‘full-time equivalent teachers per 1,000 population’, uses jargon that may not be clear to the public.

Comparable

49. Indicators should ideally be comparable on a consistent basis both between organisations and over time. The first ideal can be difficult to achieve due to differences in data standards, collection methods etc., and will depend on well-thought-out and agreed definitions. The second ideal, comparison over time, is often easier to achieve, as many aspects of a service will not change, or will change only slowly. Even so, the way data are measured and collected within an organisation may change even if the service itself does not.

50. An essential aspect of the comparability of performance indicators is the inclusion of the context within which the comparison is taking place. External or internal circumstances can differ to such a degree that comparison is invalid.

51. If indicators are not directly comparable, one possible solution could be to standardise data. For example, numerical ratios can be used to level out demographic differences between areas (like the example mentioned below of spending per person over 75). However, creating such ratios demands good and detailed information for the denominator. An alternative may be to disaggregate data, to separate the components that can be compared from those that would distort results. As an example, an overall indicator for a trust could be divided into sub-elements, such as particular day care procedures or various cost elements. Decomposing data in such a way may increase the number of indicators, but could be necessary to obtain usable comparative data.

Example: ‘Number of visitors to the authority’s museum(s)’ would be valid on a local basis over time, for example to show the effectiveness of the museum’s promotion activities: are the public making more visits than before? But for inter-authority comparison, such an indicator could be misleading, as the museums involved could vary considerably in size and scope and in the type of area (tourist or residential) they are situated in.

Another example of a PI that might be difficult to use for comparison is ‘Percentage of all procedures carried out as day surgery’. Figures may be more a reflection of different case mixes and definitions of procedures within hospitals than of real performance, making comparison misleading.

Contextual information can help make some indicators more useful for comparison. The indicator, ‘The amount spent on “meals on wheels” per head of population’, could be distorted if the area has a high proportion of elderly residents. Using context information on the number of over 75s to calculate the spending on ‘meals on wheels’ per person over 75 would provide a more reasonable comparison.
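The standardisation described in paragraph 51, and the ‘meals on wheels’ example, amount to choosing a more relevant denominator. A minimal Python sketch (the language and all figures are illustrative, not from the paper):

```python
# Standardising a spending indicator with a more relevant denominator
# (paragraph 51 and the 'meals on wheels' example). All figures invented.

def per_capita(spend, denominator):
    """Spend divided by the chosen population denominator."""
    return spend / denominator

areas = {
    # area: (annual 'meals on wheels' spend in pounds, population, residents over 75)
    "Area X": (300_000, 100_000, 10_000),
    "Area Y": (300_000, 100_000, 5_000),
}

for name, (spend, population, over_75s) in areas.items():
    print(
        f"{name}: {per_capita(spend, population):.2f} per head, "
        f"{per_capita(spend, over_75s):.2f} per person over 75"
    )
```

On these invented figures the two areas spend identically per head of population (3.00), but Area Y spends twice as much per person over 75 (60.00 against 30.00), which is the more reasonable basis for comparison.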


Verifiable

52. The indicator also needs to be collected and calculated in a way that enables the information and data to be verified. The indicator should allow aggregation and disaggregation of the data so that recalculation can take place, and if appropriate a description should be available of the statistical techniques and sampling methods used. The indicator should be based on robust data collection systems, and it should be possible for managers to verify the accuracy of the information and the consistency of the methods used. This stresses the importance of good internal quality controls at different levels of the organisation. External reviews, for example in the form of external audits and inspections, should ideally complement these internal control processes.

53. All indicators should have evidence available, preferably documentary evidence, making it possible to verify them. Documentary evidence can also add weight to a qualitative (yes/no) indicator for which there may be little other evidence.

Example: The indicator, ‘Does the council have a policy of allocating a named officer to deal with each planning application?’ (a quality issue in processing planning applications), could be taken to refer to an informal policy, achieved some of the time. Adding the phrase ‘whose name is given to applicants when the application is acknowledged’ will make it possible to check the documentary evidence, in this case by reviewing a sample of files.

Cost effective

54. Another important criterion is to balance the cost of collecting information with its usefulness. Where possible, an indicator should be based on information already available and linked to existing data collection activities. Managers are likely to have already assessed the costs and benefits of using the PIs for managing their services, so such indicators should inherently be cost effective. When a new indicator is needed it should be designed to minimise the burden on an organisation and its employees of assembling the information. This will almost inevitably involve some degree of trade-off between the costs of collection and analysis of information, and the ideal PI. Obsolete indicators should be discarded to keep the costs of data collection to a minimum.

Unambiguous

55. It should be clear whether an increase in an indicator value represents an improvement or a deterioration in service. A change in an indicator should be capable of a clear and unambiguous interpretation, and indicators should be designed and measured so that improvement in the indicator is possible only through an improvement in the service. Setting a target can often be helpful in reducing ambiguity.

Example: The information in the indicator, ‘The percentage of case notes available at the start of outpatient clinics’, is unambiguous: the higher the better. ‘The number of frauds detected by an internal audit section’ could be interpreted in two ways. A low level of frauds might mean that the authority had been very effective in reducing the incidence of fraud, so there are few frauds to detect. It could equally well mean that the audit section was not effective in detecting fraud and was finding only a small proportion of cases.

Attributable

56. Service managers should be able to influence the performance measured by the indicator (that is, it should either be totally within their control or at least open to significant influence). If this is not the case, the incentives for making an effort to improve performance will diminish, and the performance indicators may be regarded as unfair and discourage staff and managers. Cross-cutting issues can present particular problems, where the need is for managers to accept joint responsibility for performance.


Example: The indicator, ‘Variance between budget and actual spending’, could be a good or bad PI depending on how it is applied. It is a good measure if budgets are devolved to departments and the person responsible for expenditure is also the budget holder and receives the necessary information to control the budget. But if it is used for monitoring without devolving responsibility for budget control, then performance cannot be attributed.

Responsive

57. A performance indicator should be responsive to change. An indicator where changes in performance are likely to be too small to register will be of limited use. This can be the case particularly with qualitative (yes/no) indicators, as progress towards achieving a ‘yes’ is not captured. This problem can sometimes be overcome by using a number of ‘yes/no’ indicators together to give a picture of progress, or by converting a yes/no indicator to a numerical one by asking ‘What proportion of the processes meet the “yes” criterion?’

Example: The PI, ‘The percentage of staff in departments complying with Investors in People (IIP) standards’, or ‘The proportion of yes answers to a list of the detailed criteria involved in IIP’, would be responsive and enable progress to be identified. By contrast, the indicator, ‘Has the organisation achieved IIP status?’, is a single yes/no PI, where only a major change in performance (from no to yes) can be tracked.

Avoid perverse incentives

58. When constructing a performance indicator, it is important to consider what behaviour the indicator ought to encourage. Indicators that might encourage counter-productive activity should be avoided if possible. Examples are PIs that encourage staff or managers to shift problems to other organisations or to areas not being measured, or to allocate disproportionate resources to activities simply because they are being measured. PIs should not be open to easy manipulation, and the use of several counterbalancing indicators will sometimes be necessary to discourage such behaviour.

Example: A PI that may create perverse incentives is ‘The average waiting time for calls to be answered’, which was used to monitor the performance of an organisation’s switchboard. In order to process calls quickly, operators took less care over making sure that they put the caller through to the right extension, and the level of misdirected calls rose. A combination of counterbalancing PIs could limit this perverse incentive: ‘The average waiting time for calls to be answered’ and ‘The percentage of calls that are put through correctly’.

Another example of a PI that may not always have desirable effects is ‘The percentage of trains arriving within 10 minutes of scheduled time’. Such an indicator could result in a situation where, when two trains that will share the same line are both late, the one with the longer delay is held back to ensure that the other train arrives on time.

Allow innovation

59. The definition of an indicator ought not to deter organisations from developing innovative processes or coming up with alternative methods, systems or procedures to improve service delivery. PIs should ideally be constructed to allow and encourage such innovation to take place. Indicators that focus on outcome and user satisfaction are more likely to do this than indicators that are tied into existing processes.

Example: A contract for street cleaning was monitored against a specification with a set of performance targets and indicators that listed each street, and said how frequently it was to be cleaned and the type of cleaning to be carried out. This did not allow the contractor to vary the service delivered to reflect the extent to which streets became littered. A better specification would define the standards of cleanliness to be achieved and the speed of response to a report of littering, and allow the contractor to deliver the service in the most effective way available.
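Paragraph 57’s suggestion of converting a set of yes/no indicators into a single numerical one (‘What proportion of the processes meet the “yes” criterion?’) is a simple proportion. A Python sketch; the IIP-style checklist items are invented for illustration:

```python
# Converting qualitative (yes/no) indicators into a responsive numerical
# indicator (paragraph 57): the proportion of criteria currently met.
# The checklist items below are invented, not taken from the IIP standard.

def yes_proportion(checklist):
    """Proportion of yes/no criteria currently met."""
    return sum(checklist.values()) / len(checklist)

iip_criteria = {
    "training plan in place": True,
    "induction for all new staff": True,
    "annual appraisals carried out": False,
    "training budget reviewed": False,
}

print(f"{yes_proportion(iip_criteria):.0%} of criteria met")  # → 50% of criteria met
```

Unlike a single ‘Has the organisation achieved IIP status?’ indicator, this proportion moves as each criterion is met, so progress registers.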


Statistically valid

60. Indicators should be statistically valid. Performance indicators based on a small number of cases are likely to show substantial annual fluctuations. In these instances, it should be considered whether a performance indicator is the right method to gauge the development of performance, or whether a larger sample size is possible.

Example: The definition of the best value PI, ‘The percentage of undisputed invoices which were paid in 30 days’, states that it is to be based on an analysis of all invoices, or of a representative sample of at least 500 invoices. This will provide a good degree of accuracy.

In contrast is the PI, ‘The number of deaths due to fire in a brigade area in any one year’. The number could be small in some brigades and therefore subject to quite random fluctuations from year to year. A five-year moving average would reveal trends more robustly.

Timely

61. The PI should be based on data that are available within a reasonable time-scale. This time-scale will depend on the use made of the data. Some data are collected on a weekly or even daily basis, as they are needed in the operational management of the service, whereas others are available once a year for more strategic and long-term purposes. Organisations need to be aware of the risk of basing decisions on data that are out of date and no longer accurate.

Assessing the importance of the criteria

62. In practice it can be difficult to devise a performance indicator that fulfils all the criteria precisely, and trade-offs will often be necessary. Many PIs are likely to score less well on one or two criteria. Less than ‘perfect’ indicators can, however, represent a valid starting place if refinements are carried out as more insight into the area is gained.

63. What is crucial is thus that the indicator is developed in the context of its use and users [EXHIBIT 5].

EXHIBIT 5: Types of uses

The matrix illustrates the use of different types of performance indicators according to the public/internal and national/local* dimensions.

• Public, national: national publication against targets or in ‘league’ tables. These indicators may also be used to demonstrate accountability.
• Public, local: publishing local standards and targets, in order to enhance accountability and to inform the local community.
• Internal, national: comparisons and benchmarking between organisations, to improve service performance.
• Internal, local: use of PIs as part of the local management process, and to monitor the achievement of local political objectives.

Source: Audit Commission

* ‘Local’ is here used to signify PIs that are set as a supplement to any national PIs by NHS trusts, health authorities, government agencies and local government.
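The five-year moving average recommended for volatile counts such as fire deaths (the ‘Statistically valid’ example) can be sketched as follows; Python and the yearly figures are illustrative, not from the paper:

```python
# Smoothing a volatile annual indicator with a trailing moving average
# (the 'Statistically valid' example on fire deaths). Figures invented.

def moving_average(values, window=5):
    """Trailing moving averages over the given window size."""
    return [
        sum(values[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(values))
    ]

deaths_per_year = [2, 7, 1, 4, 6, 0, 8, 3]
print(moving_average(deaths_per_year))  # → [4.0, 3.6, 3.8, 4.2]
```

The smoothed series varies far less than the raw annual counts, making any underlying trend easier to see.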


64. Public-national use of indicators demands that indicators are clearly defined, comparable, verifiable, unambiguous and statistically valid. In order to achieve these qualities it may be necessary to compromise on some other factors.

65. Indicators that are published to the local community (public-local use) should first and foremost be relevant and easy to understand. They should be unambiguous to avoid confusion, or at least they should be clearly interpreted. Comparability over time could be more important than comparability between organisations.

66. Indicators used for benchmarking (internal-national) can often be useful even if they score less well on some of the criteria, as the benchmarking process can involve discussion and more detailed exploration of differences in definition and measurement.

67. Indicators used internally (internal-local) should be relevant to local corporate objectives. They should also be responsive to changes in performance by the authority, and if used for management, they should also be timely.

68. When criteria are not met, it is essential to be aware of the implications for the collection and interpretation of the data. Extra training of staff or better communications with stakeholders may, for example, be required.


6. Making indicators work


Controlling the number of indicators

69. The challenge when developing PIs is to balance the number of performance indicators against the overall ability to describe the service. The number of indicators, however, depends on what is appropriate for the target group and context in question. In some cases, where a stakeholder is interested in several complex services or where there are many different stakeholders, it might prove difficult to keep the number of indicators down.

70. It is important that the set of indicators is chosen carefully, as it is easy to produce a portfolio with so many indicators that it becomes hard to focus on the true performance of the organisation. Organisations need to keep the overall number of indicators manageable if they are not to be swamped with information.

71. Most public service organisations provide a variety of services, so although there may be only a handful of indicators for a single service, the overall portfolio could include over a hundred indicators. The CIPFA report, Measuring Up (1998), argues that a good working rule is a set of 10 to 20 indicators for any one reader or individual manager. The number will depend upon the position of this individual, the complexity of the area and the intended use of the PIs. There are a number of ways to reduce the number of indicators.

Develop ‘action-focused’ indicators

72. Performance information systems often accumulate unnecessary indicators on the assumption that ‘it would be nice to know about…’. Such indicators distract attention from the important issues facing a service. An important question that should be applied to every indicator is ‘What action could the recipient of the information take on the basis of this information?’ Indicators for which the answer is that the recipient would never take action should not normally be in the portfolio.

73. This criterion particularly relates to ‘context’ information. Information such as ‘the proportion of the local population that is over 75’ should thus not be confused with a real indicator of performance. It may, however, deliver essential knowledge for the user to be able to understand the context of the service and to interpret the performance figures.

Report only indicators that are relevant to the user

74. Some organisations include every indicator in their performance reports, regardless of whether it is used at an operational or strategic level. Indicators should be tightly focused on the user’s information needs. If the indicators suggest that there are problems in service delivery, users should demand additional, lower-level information on an exception basis, rather than receive it routinely.

Use composite or illustrative indicators

75. Some performance information systems use a score developed from adding several indicators together into a composite indicator. The advantage of such indicators is that they can present an overall picture of a service area, making it easier to get a quick overview of performance.


76. An important issue when constructing composite indicators is to decide the weights that are applied to each indicator in the formula. Ascribing weights to different functions can be difficult and contentious, and the interpretation of composite indicators can mask differences in performance of the areas clustered together.

77. The NHS Performance Assessment Framework high-level indicators include a composite indicator of potentially avoidable deaths from ten different conditions. No weighting has been applied to the different conditions. An example of a composite indicator where weighting has been used is ‘the average cost of handling a Housing Benefit or Council Tax Benefit claim, taking into account differences in the types of claim received’, where case costs are weighted by caseload mix. This is an example of a relatively simple composite indicator, as the factors do not vary much in nature.

78. Another example of a composite indicator, where different aspects of performance are added together, is the university league table published yearly by The Times newspaper. The main indicator determining the ranking of the universities is composed of eight indicators to which different weights are applied. Teaching assessments, for example, are weighted 2.5, research is weighted 1.5 and the remaining indicators 1.

79. An alternative approach to developing a composite indicator is to identify the most important aspect of the service and report a single indicator to illustrate the performance of an organisation in that service area. The assumption behind such illustrative indicators is that if the organisation is good at this aspect of the service, it is likely to be good at the others as well. However, this carries the risk of distorting service delivery, as most of the effort might be put into this area. This type of indicator can be acceptable provided that the user is satisfied that a balanced set of indicators is being monitored at another level in the organisation. For example, an illustrative indicator within best value for trading standards is ‘the average number of consumer protection visits per high and medium risk premises’. This indicator captures an important aspect of trading standards, but does not reflect a number of other activities.

Regular refinement of indicators

80. Public services are undergoing continual change due to both internal and external factors and in response to users’ demands. It is important that performance indicators react to these changes. Change might occur if political priorities have changed, the demand for services has changed, or a programme or development has been completed. Changed or additional indicators may also be necessary if the indicators originally chosen are found to be flawed, or new and better data become available.

81. Organisations also need to respond if the performance indicators suggest that objectives are not being met, by developing action plans that will require performance indicators to monitor their implementation. The performance measurement system should not only report indicators of performance but also incorporate an evaluation and review process to consider whether it is measuring the right things in the right way. But indicators should not be amended too often, otherwise long-term trends and comparisons will be lost.
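The weighted composites of paragraphs 76 to 78 reduce to a weighted average. A Python sketch; the component scores are invented, and only the weights (teaching 2.5, research 1.5, others 1) follow the league-table example:

```python
# A weighted composite indicator (paragraphs 76-78). The weights mirror
# the university league-table example (teaching 2.5, research 1.5,
# remaining indicators 1); the component scores are invented.

def composite(scores, weights):
    """Weighted average of component indicator scores."""
    total_weight = sum(weights[name] for name in scores)
    return sum(scores[name] * weights[name] for name in scores) / total_weight

weights = {"teaching": 2.5, "research": 1.5, "entry standards": 1.0}
scores = {"teaching": 80.0, "research": 60.0, "entry standards": 70.0}

print(f"composite score: {composite(scores, weights):.1f}")  # → composite score: 72.0
```

Changing the weights dictionary changes the ranking, which is exactly why paragraph 76 warns that ascribing weights can be contentious.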


Setting targets

82. Setting targets can be a helpful method to challenge the organisation or a specific service area to do better. Targets can provide a forward-looking perspective and information not only on the level of activity of a service but also on whether objectives are being achieved.

Targets could be based on:
• political priorities
• community and customer priorities or concerns
• previous performance
• internal comparison with other units within the organisation
• external comparison to identify good practice (either with other public organisations or with private sector organisations).

Targets can be:
• All-the-time targets, which promise the level of service to be delivered all the time.
• Percentage achievement targets, which are commitments to achieve a stated level of performance against a standard.
• Qualitative targets, which are descriptive targets of what level of service to expect.
• Time-bound targets, constituting a one-off promise for a certain area.
• National, regional or family targets, which are set for a demographic and/or service area.

83. It is crucial that the targets are realistic (not constituting a ‘wish list’) but at the same time challenging for the organisation and its individuals. The targets should, in other words, be SMART: Specific, Measurable, Achievable, Relevant and Timed. Box D lists a number of examples of good and bad targets.

BOX D: Examples of good and bad targets

Examples of good targets:
‘We will reduce the number of missed household collections by 10 per cent by next year.’
‘We will aim to collect 95 per cent of the council tax which is due next year.’
‘We will increase the number of visits to local libraries by 20 per cent before the end of 2001.’
‘We will cut the number of unfilled places in primary schools by 10 per cent by 31 December 2000.’

Examples of poor targets:
‘We will improve the way we handle complaints.’
‘We will buy as many books for the schools as possible.’
‘We aim to have the best bus service in the region.’
‘We aim to increase co-operation between school and police authorities.’
‘We will answer 75 per cent of all letters within 5 days’ (a poor target if the remaining 25 per cent take 3 months to answer).

Source: Audit Commission

Avoid common pitfalls

84. Box E summarises some of the pitfalls when developing performance indicators and suggests some ways in which the difficulties may be overcome.
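A measurable, time-bound target such as Box D’s ‘reduce the number of missed household collections by 10 per cent by next year’ can be checked mechanically against outturn. A Python sketch with invented baseline and outturn figures:

```python
# Checking a percentage-reduction target (Box D: 'reduce the number of
# missed household collections by 10 per cent by next year').
# The baseline and outturn figures are invented for illustration.

def target_met(baseline, outturn, reduction=0.10):
    """True if the outturn shows at least the required percentage reduction."""
    return outturn <= baseline * (1 - reduction)

baseline_missed = 1_200
outturn_missed = 1_050  # a 12.5 per cent reduction on the baseline

print(target_met(baseline_missed, outturn_missed))  # → True
```

The poor targets in Box D fail this kind of check precisely because they name no baseline, percentage or deadline to compute against.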


BOX E

Common pitfalls when setting up performance indicators and how to avoid them

Pitfall: PIs that measure activity rather than performance will provide
less useful data and information overload.
How to avoid it: A focus on the key objectives of the organisation will
keep attention on the essential goals. From these key objectives, it is
important to align indicators to the more operational levels.

Pitfall: Focusing on short-term targets at the expense of long-term
objectives is a risk, due to pressure for immediate good performance.
How to avoid it: The balanced scorecard approach can help to ensure the
inclusion of both long- and short-term objectives.

Pitfall: Lack of understanding of outcome measures might lead to this
type of PI being underused.
How to avoid it: It is worth spending time on developing good outcome
measures, though this is not an easy task. The ripple effect can be a
helpful method. Measures of processes associated with good outcomes may
also be used if outcome measures are not available.

Pitfall: Too many financial measures compared with quality measures can
lead to skewed performance and neglect of essential areas.
How to avoid it: The balanced scorecard or a similar approach should be
considered to ensure the right balance.

Pitfall: Manipulation of data to improve the measured performance is a
risk, especially when performance is published, ownership of the
indicators is weak, or staff reward and censure depend on the indicators.
How to avoid it: Perverse incentives can be minimised by setting up
counterbalancing PIs, by verification of data and by involving staff in
the construction of indicators.

Pitfall: Danger of specifying data because they may be interesting
rather than needed.
How to avoid it: Again, a focus on the key objectives of the service or
function can reduce the risk of ending up with ‘nice to know’ rather
than ‘need to know’ indicators. But organisations should recognise the
possible need for context indicators.

Pitfall: Risk of measuring job processes that are easy to measure rather
than those that have the greatest potential value, for example, routine
work vs. research projects.
How to avoid it: Focus on key objectives and a cascading down to more
operational measures can improve the insight into the valuable processes
of the organisation.

Pitfall: Not targeting the PIs on the relevant stakeholder groups will
often lead to the information not being used.
How to avoid it: Stakeholder analyses and clear information and
communications strategies can improve the targeting of PIs to
stakeholders by understanding their needs. Clarity of purpose is achieved.

Pitfall: Not comparing like with like can lead to feelings of unfairness
and lack of trust in the performance measures.
How to avoid it: Data quality must be high and consensus established on
the principles on which comparison is based. Trust can be enhanced by
using PIs intelligently, to prompt questions rather than to jump to
conclusions.

Pitfall: Not understanding user needs may lead to the wrong PIs being
collected and efforts put in the wrong areas.
How to avoid it: Stakeholder analysis can again provide a useful tool.

Pitfall: Not revising the system in response to internal and external
changes may lead to an outdated system not measuring the significant
things and possibly sending the organisation in the wrong direction.
How to avoid it: Regular refinement of individual indicators and of the
set of indicators should be included in the evaluation and review system
of the organisation.

Source: Audit Commission
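Two of the pitfalls above, manipulation of data and measuring only what is easy, can be mitigated with counterbalancing PIs. As a hedged sketch (the function name and the figures are illustrative, not from the paper), the poor target ‘We will answer 75 per cent of all letters within 5 days’ could be reported alongside tail statistics that expose whether the remaining letters languish for months:

```python
import statistics


def letter_response_indicators(response_days, threshold=5):
    """Headline PI plus counterbalancing PIs for letter response times.

    The headline figure ('x per cent answered within the threshold') can
    hide a long tail of very slow replies; reporting the tail alongside it
    removes the incentive to neglect the remaining letters.
    """
    within = sum(1 for d in response_days if d <= threshold)
    pct_within = 100.0 * within / len(response_days)

    tail = [d for d in response_days if d > threshold]
    median_tail = statistics.median(tail) if tail else 0
    worst_case = max(tail) if tail else 0
    return pct_within, median_tail, worst_case


# Illustrative data: 75% answered promptly, but the rest take months.
days = [2, 3, 4, 4, 5, 5, 1, 3, 2, 5, 60, 75, 90, 45, 80, 2, 3, 4, 5, 1]
pct, median_tail, worst = letter_response_indicators(days)
print(f"{pct:.0f}% within 5 days; median late reply {median_tail} days; "
      f"slowest reply {worst} days")
```

On this invented data the headline PI looks healthy (75 per cent within 5 days), while the counterbalancing PIs reveal that late replies typically take over two months, which is exactly the distortion the box warns about.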
7. Conclusion: what to do next?


85. Setting up a portfolio of performance indicators is a key part of a
good performance measurement system. Creating a balanced picture of the
service and focusing on the priorities of the organisation are not easy
tasks, but achieving an understanding of the users and uses of
performance indicators is a crucial first step in this process.

86. ‘The perfect PI’ is seldom created overnight and will often need to
be improved and refined as more experience and insight are gained. It is
important that efforts should be made to come up with good, solid
indicators, and that an ‘easy’ option in relation to the development and
choice of PIs is not slipped into. Organisations should also look for
alternative ways of assessing performance in those cases where it proves
difficult to develop quantifiable indicators. But at the same time, it
is important to get started with the work of developing PIs. Planning
and preparation, experienced staff and motivated management should be
allocated to the creation of a set of robust performance indicators.
Bibliography

Better by Far: Preparing for Best Value, Audit Commission, 1998

Management Information. Can You Manage Without It?, CIPFA, 1997

Managing Services Effectively, HMSO/Audit Commission, 1989

A Measure of Success: Setting and Monitoring Local Performance Targets,
Audit Commission, 1999

Measuring Up, CIPFA, 1998

The Measures of Success: Developing a Balanced Scorecard to Measure
Performance, Accounts Commission, 1998

Modernising Government, White Paper, March 1999

Performance Indicators for 2000/2001, consultation document produced by
DETR and the Audit Commission on Best Value and Local Authority
Performance Indicators for 2000/2001

Performance Management Framework for NHS Wales, National Assembly for
Wales, 2000

Performance Review Implementation Guide, Audit Commission, 1990

Planning to Succeed: Service and Financial Planning in Local Government,
Audit Commission, 1999

Putting Quality on the Map, HMSO/Audit Commission, 1993

Quality and Performance in the NHS: High Level Performance Indicators,
NHS Executive, Department of Health, June 1999

Wiring it Up – Whitehall’s Management of Cross-cutting Policies &
Services, Performance and Innovation Unit, January 2000
ON TARGET

Performance measurement, including the use of performance indicators, is
an essential tool for improving public services. But the full benefit of
using performance indicators will be achieved only if the indicators are
devised carefully and used appropriately. This paper, and its companion
paper, Aiming to Improve: The Principles of Performance Measurement, are
based on the lessons learnt from the development and use of performance
indicators by the Commission and other experts in the field over the
past decade.

This paper is written for managers who are responsible for devising
performance indicators. The companion paper provides an overview of
performance measurement for policy makers and senior managers.

Further copies are available from:

Audit Commission Publications
Bookpoint Ltd
39 Milton Park
Abingdon
Oxon OX14 4TD
Telephone: 0800 502030

£15.00 net

Audit Commission
1 Vincent Square, London SW1P 2PN
Telephone: 020 7828 1212  Fax: 020 7976 6187
www.audit-commission.gov.uk