On Target: The Practice of Performance Indicators
Management Paper
Auditors are appointed from District Audit and private accountancy firms
to monitor public expenditure. Auditors were first appointed in the 1840s
to inspect the accounts of authorities administering the Poor Law. Audits
ensured that safeguards were in place against fraud and corruption and
that local rates were being used for the purposes intended. These founding
principles remain as relevant today as they were 150 years ago.
Public funds need to be used wisely as well as in accordance with the law,
so today’s auditors have to assess expenditure not just for probity and
regularity, but also for value for money. The Commission’s value-for-money
studies examine public services objectively, often from the users’
perspective. Its findings and recommendations are communicated through
a wide range of publications and events.
Contents
1. Introduction
2. The different users and uses of performance indicators
3. Developing effective performance indicators
4. Types of indicators
5. Criteria for robust performance indicators
6. Making indicators work
Bibliography
First published in June 2000 by the Audit Commission for Local Authorities
and the National Health Service in England and Wales, 1 Vincent Square,
London SW1P 2PN.
1. Introduction
1. The introduction of best value in local government, the Performance Assessment Framework (PAF) in the NHS and in social services, the NHS Wales Performance Management Framework (PMF), and Public Service Agreements (PSAs) in central government have emphasised the importance of having good performance indicators (PIs) as part of performance management in the public sector. To complement the national sets of PIs, public sector organisations are expected to set their own PIs and targets, and to monitor and publish their performance.

2. The Commission has extensive experience of developing and using performance indicators. This paper on The Practice of Performance Indicators sets out the lessons that have been learnt. It is aimed at helping managers and practitioners in local government, the NHS and central government to develop their own set of balanced and focused indicators. It describes ways to ensure that PIs are robust and well framed.

3. Indicators should be used within a wider framework of performance measurement systems, performance management and overall strategic management of services. This is described more fully in the Audit Commission paper Aiming to Improve: The Principles of Performance Measurement. These principles are listed in Box A.

BOX A
The six principles of a performance measurement system
• Clarity of purpose. It is important to understand who will use the information, and how and why the information will be used. Stakeholders with an interest in, or need for, performance information should be identified, and indicators devised which help them make better decisions or answer their questions.
• Focus. Performance information should be focused in the first instance on the priorities of the organisation – its core objectives and service areas in need of improvement. This should be complemented by information on day-to-day operations. Organisations should learn how indicators affect behaviour, and build this knowledge into the choice and development of their performance indicators.
• Alignment. The performance measurement system should be aligned with the objective-setting and performance review processes of the organisation. There should be links between the performance indicators used by managers for operational purposes, and the indicators used to monitor corporate performance. Managers and staff should understand and accept the validity of corporate or national targets.
• Balance. The overall set of indicators should give a balanced picture of the organisation’s performance, reflecting the main aspects, including outcomes and the user perspective. The set should also reflect a balance between the cost of collecting the indicator, and the value of the information provided.
• Regular refinement. The performance indicators should be kept up to date to meet changing circumstances. A balance should be struck between having consistent information to monitor changes in performance over time, taking advantage of new or improved data, and reflecting current priorities.
• Robust performance indicators. The indicators used should be sufficiently robust and intelligible for their intended use. Independent scrutiny, whether internal or external, helps to ensure that the systems for producing the information are sound. Careful, detailed definition is essential; where possible, the data required should be needed for day-to-day management of the services.
Source: Audit Commission
2. The different users and uses of performance indicators
EXHIBIT 1
The different users and uses of indicators
Indicators should form a coherent set, with operational indicators supporting the publication of local and national indicators.
[Diagram: maps users (for example, managers and staff) to uses (for example, management information indicators for day-to-day management).]
3. Developing effective performance indicators

Focus on the right topics

EXHIBIT 2
Economy, efficiency and effectiveness link inputs to outcomes.
The ‘Three Es’
13. One common way of developing performance indicators is to use the three dimensions of economy, efficiency and effectiveness [EXHIBIT 2]. These have been central to the Audit Commission’s work.

14. The basic measures when constructing the three Es are:
• cost: the money spent to acquire the resources;
• input: the resources (staff, materials and premises) employed to provide the service;
• output: the service provided to the public, for example, in terms of tasks completed; and
• outcome: the actual impact and value of the service delivery.

15. The three dimensions of performance are defined as follows:
Economy: ‘acquiring human and material resources of the appropriate quality and quantity at the lowest cost’ (staff, materials, premises). An example is the cost of buying new books for a library.
Efficiency: ‘producing the maximum output for any given set of resource inputs or using the minimum inputs for the required quantity and quality of service provided’. An example is the cost per visit to public libraries.
Effectiveness: ‘having the organisation meet the citizens’ requirements and having a programme or activity achieve its established goals or intended aims’. An example is ‘the percentage of library users who found the book/information they wanted, or reserved it, and were satisfied with the outcome’. Effectiveness is about assessing whether the service is actually achieving what it set out to do.

16. Some organisations add a fourth E, equality or equity. This dimension captures the degree to which access to services is equitable, and whether the services are appropriate to the needs of all those who should be able to use them. This has been incorporated in several measurement frameworks (for example, both the NHS Performance Assessment Framework and the set of best value performance indicators have the category fair access).

17. A related approach to ensure a balanced representation of the service is the differentiation between quality, cost and time [BOX B]. Cost reflects the financial side of the organisation’s activities, quality captures the features of a service and their appropriateness for the user, and the time aspect covers the responsiveness and speed with which services are delivered. Not all stakeholders will find these dimensions equally relevant, but together cost, quality and time can provide a simple way to ensure balanced coverage in relation to a service area. See Measuring Up, CIPFA, 1998, for further information.

BOX B
A rounded set of indicators
A balanced set of indicators can be constructed by using cost, time and quality measures.
Example:
Average speed of processing a new housing benefit claim (timeliness)
Average cost of a claim (cost)
Percentage of cases where the calculation of benefit was correct (service quality)
Source: Adapted from Measuring Up, CIPFA, 1998
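To make the Box B example concrete, the sketch below computes the three measures from a small set of claim records. It is illustrative only, assuming a simple claim record with received and decided dates, a handling cost and a correctness flag; none of these field names or figures comes from the paper.

```python
# Illustrative sketch only: computing the Box B cost/quality/time set for
# housing benefit claims. The Claim fields and the sample figures below are
# invented for the example; they are not taken from the paper.
from dataclasses import dataclass
from datetime import date

@dataclass
class Claim:
    received: date            # date the new claim was received
    decided: date             # date the claim was processed
    handling_cost: float      # cost of handling the claim, in pounds
    calculated_correctly: bool

def balanced_indicator_set(claims: list[Claim]) -> dict[str, float]:
    """Return the three Box B measures: timeliness, cost and service quality."""
    n = len(claims)
    return {
        "average_days_to_process": sum((c.decided - c.received).days for c in claims) / n,
        "average_cost_per_claim": sum(c.handling_cost for c in claims) / n,
        "percent_calculated_correctly": 100 * sum(c.calculated_correctly for c in claims) / n,
    }

claims = [
    Claim(date(2000, 4, 3), date(2000, 4, 28), 52.00, True),
    Claim(date(2000, 4, 10), date(2000, 5, 26), 61.50, False),
    Claim(date(2000, 4, 17), date(2000, 5, 2), 48.00, True),
]
print(balanced_indicator_set(claims))
```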
4. Types of indicators
Creating outcome measures
22. Outcome measures are crucial to monitoring the achievement of service objectives. Having determined the objectives, it can, however, be difficult to devise suitable outcome measures, especially in complex areas such as social services and the health service.

23. The ‘ripple effect’ approach can help to identify links between objectives and outcome measures for a service [EXHIBIT 3]. Outcome measures can be quite difficult to construct, as outcomes may be long term and influenced by various factors. A service could be viewed as a stone tossed into water, creating several ripples of effect around it. The outer ripples reflect the overarching objectives of the service, and the inner ripples the more measurable aspects of the service. This process can be used in two ways. First, it can be used to check whether measures of the broader outcomes of the service need to be established (moving from the inner to the outer circles).

EXHIBIT 3
The ‘ripple effect’ approach
A simplified example of how the ripple effect links objectives with outcome measures: the outer circle contains the less measurable outcome elements of the service (people able to have a reasonable standard of living), the inner circle the more measurable indicators and targets (benefits paid accurately and on time).

24. Outcome measures – measures of effectiveness – depend on a clear understanding of what the service is seeking to achieve. Economy and efficiency indicators can usually be constructed quite simply by looking at costs and at resource deployment. But to see whether the service is effective means going back to the original problem, and asking ‘has the problem been resolved, or the benefit been delivered?’ Answering the question may be difficult and require surveys or detailed one-off investigations.

25. Outcomes of services may also take a long time to emerge, such as the impact of services on quality of life. The full benefits of pre-school education may not be apparent until much later in life. It may be possible to monitor the service only by looking at process measures,
such as the proportion of children receiving pre-school education, as these can be measured in a more timely way. Process indicators thus measure the activities that will lead to the final results. To use a ‘substitute’ of this kind depends on evidence that process and outcome are linked, or at least on a general consensus that the process contributes to the outcome. Research may be needed to avoid process measures that lead the organisation to the wrong outcomes.

29. The Audit Commission developed a concept of quality in Putting Quality on the Map, 1993, to reflect the fact that, in the public sector, financial constraints and the subsidy given to many services may mean that it is not always possible to satisfy users’ needs. Additionally, it is important to bear in mind that quality is multidimensional. Providing a quality service means balancing the different factors of service standards, delivery and costs. The quality map [EXHIBIT 4] incorporates the following elements:
• Communication: managing users’ expectations through communicating with service users, advising them of the standards that can be provided within the resources available, how services will be allocated or prioritised, and incorporating this knowledge in the standards of service specified. The importance of understanding stakeholders’ priorities is further illustrated in Box C.
• Specification: questions the standards of services that are set. These should be based on balancing users’ needs and wishes with the organisation’s resource constraints.
• Delivery: are the standards delivered reliably?
• People & systems: are staff trained to deliver services effectively, and are they supported by good systems?
• Effective use of resources: does the service make good use of resources?

EXHIBIT 4
The quality map
The Commission emphasises the importance of communication as well as consistent delivery of appropriate standards, and economic use of resources.

BOX C
A metropolitan council sought to improve its housing service by renovating its housing offices, improving the reception areas with carpeting and plants. It also provided facilities such as vending machines, toys for children and toilets. But a survey showed that these improvements had had little effect on users’ satisfaction with the service. Further research showed that the aspects of the service that had been improved were low on users’ priorities. What users most wanted was to be able to see the same officer each time they visited the housing offices. This issue had not been addressed by the council.
Source: Audit Commission

30. A number of methods can be employed to gauge the quality of a service:
• quantitative indicators;
• consumer surveys;
• number of complaints;
• proxy measures of quality;
• qualitative (yes/no) indicators, for example, checklists of ‘good practice’ processes; and
• professional assessment.

31. Quantitative indicators can in some cases be devised to address the quality aspect of a service. An example is the ‘percentage of 999 calls answered within local target response time’. If paramedics reach a heart attack victim quickly it can save a life, and the police arriving at a crime scene quickly can be important in reassuring victims and following up incidents. Quantitative indicators can also address reliability or accuracy issues: for example, ‘the percentage of trains running on time’ and ‘the percentage of trains cancelled’ measure the reliability of the service.

32. When basing a PI on consumer surveys, it is important to have a clear understanding of the target group(s) and whether they are reached by the survey. The type of survey should also be considered: will it probe the users’ experience of the service, or will it focus only on their satisfaction with the service? For example, a survey on library services may miss essential information if it focuses only on the users and misses the non-users and the reasons why they do not visit libraries. Some guidance on how to carry out user satisfaction surveys is provided in Guidance on Methods of Data Collection, published by the Department of the Environment, Transport and the Regions, the Housing Corporation and the National Housing Federation in 2000, with the support of the Local Government Association, Improvement and Development
Agency and the Department of Social Security.

33. The number of complaints can be used as an indicator of quality. An increasing number of complaints about a service can be a valuable indicator that problems are developing. However, such an indicator may be unsuitable for comparing one organisation with another due to differences in expectations about service levels. A low level of complaints could indicate that service users have given up due to bad experience with the handling of complaints, or that access to the complaints system is difficult.

34. Proxy measures are used when more direct measures are either not available or very difficult or expensive to collect. One example is visits to museums, which can act as an indirect measure of quality where comparisons have to be made over time. Another example can be in an area where consumers have a choice of two or more facilities, where the relative usage of each can be taken as an indication of the way that people have balanced the issues of the quality of the facility, its accessibility and cost. If cost and accessibility are similar, the differences in usage can form a proxy measure for the relative quality of each facility. It is, however, important to use such proxy measures with caution.

35. Another tool for addressing quality is the use of qualitative (‘yes/no’) indicators. A qualitative indicator can be developed using a series of ‘yes/no’ questions to describe quality. For example, the way in which a planning application is handled can have an impact on applicants’ perception of service quality. ‘Do you provide pre-application discussions with potential applicants on request?’, ‘Do you have a publicised charter which sets targets for handling the different stages of the development control process?’, and ‘Do you monitor your performance against these targets?’ cover processes that have been found to affect users’ perception of quality. By creating such checklists of ‘good practice’ processes, important quality issues can be captured.
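As a minimal illustration of the ‘yes/no’ checklist approach in paragraph 35, the sketch below counts how many good-practice processes a development control service has in place. The questions are those quoted above; the scoring approach and the sample answers are invented for the example.

```python
# Illustrative sketch only: a qualitative ('yes/no') indicator built from a
# checklist of good-practice processes for handling planning applications.
# The answers recorded below are invented for the example.
checklist = [
    ("Do you provide pre-application discussions with potential applicants on request?", True),
    ("Do you have a publicised charter which sets targets for handling the "
     "different stages of the development control process?", True),
    ("Do you monitor your performance against these targets?", False),
]

# The indicator is simply the number of good-practice processes in place.
processes_in_place = sum(answer for _, answer in checklist)
print(f"{processes_in_place} of {len(checklist)} good-practice processes in place")
```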
36. Sometimes people may not have an explicit view of what they actually want from a service or what it is reasonable to expect. For example, though patients will be able to judge whether they feel better or not after a treatment, it may be difficult for them to determine whether the treatment could have been better. In such cases, it might be relevant to base quality standards on professional assessment and inspection.

37. In other cases, the expectations of individual citizens may differ from society’s expectations, because there is a public benefit beyond the individual. It has been argued that in these cases ‘objective measures’ of the ‘right’ level of quality should be established based on expert/professional opinion. But understanding services from the users’ perspective and developing user-focused indicators should never be overlooked, and professional assessment should be used in conjunction with user-focused indicators.

Cross-cutting indicators
38. Some services can be significantly influenced by several agencies. Indicators that measure the collective performance of all the agencies are referred to as ‘cross-cutting’ indicators. A well-chosen cross-cutting indicator, which all parties accept as relevant, and which all can influence, can have a powerful effect on improving joint working and achieving better service delivery. In some cases, individual influence can be hard to determine, and all the organisations involved may need to accept ‘joint and several’ responsibility for improving performance. In other cases, it may be possible to allocate responsibility to individual organisations more precisely.

39. An example of an issue that involves several agencies is the need to reduce and prevent crime by young people.
5. Criteria for robust performance indicators
Statistically valid
60. Indicators should be statistically valid. Performance indicators based on a small number of cases are likely to show substantial annual fluctuations. In these instances, it should be considered whether a performance indicator is the right way to gauge the development of performance, or whether a larger sample size is possible.

The definition of the best value PI, ‘The percentage of undisputed invoices which were paid in 30 days’, states that it is to be based on an analysis of all invoices, or of a representative sample of at least 500 invoices. This will provide a good degree of accuracy. In contrast to this is the PI, ‘The number of deaths due to fire in a brigade area in any one year’. The number could be small in some brigades and therefore subject to quite random fluctuations from year to year. A five-year moving average would reveal trends more robustly.
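The fire-deaths example above suggests a five-year moving average as a way of smoothing a small-count PI. A minimal sketch of that calculation is given below; the yearly figures are invented for the example.

```python
# Illustrative sketch only: smoothing a small-count PI ('deaths due to fire in
# a brigade area in any one year') with a five-year moving average. The yearly
# counts below are invented for the example.
def moving_average(yearly_counts: list[int], window: int = 5) -> list[float]:
    """Average over each run of `window` consecutive years."""
    return [
        sum(yearly_counts[i:i + window]) / window
        for i in range(len(yearly_counts) - window + 1)
    ]

deaths_per_year = [3, 7, 2, 5, 4, 6, 1, 8]    # eight years of counts
print(moving_average(deaths_per_year))        # four overlapping five-year averages
```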
Timely
61. The PI should be based on data that are available within a reasonable time-scale. This time-scale will depend on the use made of the data. Some data are collected on a weekly or even daily basis, as they are needed in the operational management of the service, whereas others are available once a year for more strategic and long-term purposes. Organisations need to be aware of the risk of basing decisions on data that are out of date and no longer accurate.

Assessing the importance of the criteria
62. In practice it can be difficult to devise a performance indicator that fulfils all the criteria precisely, and trade-offs will often be necessary. Many PIs are likely to score less well in one or two criteria. Less than ‘perfect’ indicators can, however, represent a valid starting place if refinements are carried out when more insight into the area is gained.

63. What is crucial is thus that the indicator is developed in the context of the use and users [EXHIBIT 5].

EXHIBIT 5
Types of uses
The matrix illustrates the use of different types of performance indicators according to the public/internal and national/local dimensions.
• Public – National: national publication against targets or in ‘league’ tables. These indicators may also be used to demonstrate accountability.
• Public – Local: publishing local standards and targets, in order to enhance accountability and to inform the local community.
• Internal – National: comparisons and benchmarking between organisations, to improve service performance.
• Internal – Local: use of PIs as part of the local management process, and to monitor the achievement of local political objectives.
Source: Audit Commission
Note: ‘local’ is here used to signify PIs that are set as a supplement to any national PIs by NHS trusts, health authorities, government agencies and local government.
64. Public-national use of indicators demands that indicators are clearly defined, comparable, verifiable, unambiguous and statistically valid. In order to achieve these qualities it may be necessary to compromise on some other factors.

68. When criteria are not met, it is essential to be aware of the implications for the collection and interpretation of the data. Extra training of staff or better communications with stakeholders may, for example, be required.
6. Making indicators work
76. An important issue when constructing composite indicators is to decide the weights that are applied to each indicator in the formula. Ascribing weights to different functions can be difficult and contentious, and the interpretation of composite indicators can mask differences in performance of the areas clustered together.

77. The NHS Performance Assessment Framework high level indicators include a composite indicator of potentially avoidable deaths from ten different conditions. No weighting has been applied to the different conditions. An example of a composite indicator where weighting has been used is ‘the average cost of handling a Housing Benefit or Council Tax Benefit claim, taking into account differences in the types of claim received’, where case costs are weighted by caseload mix. This is an example of a relatively simple composite indicator, as the factors do not vary much in nature.

78. Another example of a composite indicator, where different aspects of performance are added together, is the university league table published yearly by The Times newspaper. The main indicator determining the ranking of the universities is composed of eight indicators to which different weights are applied. Teaching assessments, for example, are weighted 2.5, research weighted 1.5 and the remaining indicators 1.

79. An alternative approach to developing a composite indicator is to identify the most important aspect of the service and report a single indicator to illustrate the performance of an organisation in that service area. The assumption behind such illustrative indicators is that if the organisation is good at this aspect of the service, it is likely to be good at the others as well. However, it carries the risk of distorting service delivery, as most of the effort might be put into this area. This type of indicator can be acceptable provided that the user is satisfied that a balanced set of indicators is being monitored at another level in the organisation. For example, an illustrative indicator within best value for trading standards is ‘the average number of consumer protection visits per high and medium risk premises’. This indicator captures an important aspect of trading standards, but does not reflect a number of other activities.
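A weighted composite of the kind described in paragraphs 76 to 78 is, at its simplest, a weighted sum of component scores. The sketch below is illustrative only: the component names and scores are invented, all components are assumed to be on a common scale, and the weights merely echo the 2.5/1.5/1 pattern quoted for the university league table.

```python
# Illustrative sketch only: a weighted composite indicator. The component
# names and scores are invented, all components are assumed to be on a common
# 0-100 scale, and the weights simply echo the 2.5 / 1.5 / 1 pattern quoted
# for the university league table.
WEIGHTS = {"teaching_assessment": 2.5, "research": 1.5, "other_indicators": 1.0}

def composite_score(component_scores: dict[str, float]) -> float:
    """Weighted average of the component scores."""
    total_weight = sum(WEIGHTS[name] for name in component_scores)
    weighted_sum = sum(score * WEIGHTS[name] for name, score in component_scores.items())
    return weighted_sum / total_weight

print(composite_score({"teaching_assessment": 72.0, "research": 65.0, "other_indicators": 80.0}))
```

Because a single weighted total can mask differences between its components, the component scores should normally be reported alongside the composite.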
Regular refinement of indicators
80. Public services are undergoing continual changes due to both internal and external factors or in response to users’ demands. It is important that performance indicators react to these changes. Change might occur if political priorities have changed, the demand for services has changed, or a programme or development has been completed. Changes or additional indicators may also be necessary if the indicators originally chosen are found to be flawed, or new and better data become available.

81. Organisations also need to respond if the performance indicators suggest that objectives are not being met, by developing action plans that will require performance indicators to monitor their implementation. The performance measurement system should not only report indicators of performance but also incorporate an evaluation and review process to consider whether it is measuring the right things in the right way. But indicators should not be amended too often, otherwise long-term trends and comparisons will be lost.
BOX E
Common pitfalls when setting up performance indicators and how to avoid them
Bibliography
Better by Far: Preparing for Best Value, Audit Commission, 1998
Management Information. Can You Manage Without It?, CIPFA, 1997
Managing Services Effectively, HMSO/Audit Commission, 1989
A Measure of Success: Setting and Monitoring Local Performance Targets, Audit Commission, 1999
Measuring Up, CIPFA, 1998
The Measures of Success: Developing a Balanced Scorecard to Measure Performance, Accounts Commission, 1998
Modernising Government, White Paper, March 1999
Performance Indicators for 2000/2001, consultation document produced by DETR and the Audit Commission on Best Value and Local Authority Performance Indicators for 2000/2001
Performance Management Framework for NHS Wales, National Assembly for Wales, 2000
Performance Review Implementation Guide, Audit Commission, 1990
Planning to Succeed: Service and Financial Planning in Local Government, Audit Commission, 1999
Putting Quality on the Map, HMSO/Audit Commission, 1993
Quality and Performance in the NHS: High Level Performance Indicators, NHS Executive, Department of Health, June 1999
Wiring it Up – Whitehall’s Management of Cross-cutting Policies & Services, Performance and Innovation Unit, January 2000
On Target
Performance measurement, including the use of
performance indicators, is an essential tool for improving
public services. But the full benefit of using performance
indicators will be achieved only if the indicators are
devised carefully, and used appropriately. This paper, and
its companion paper, Aiming to Improve: The Principles of
Performance Measurement, are based on the lessons
learnt from the development and use of performance
indicators by the Commission and other experts in the
field over the past decade.
Audit Commission
1 Vincent Square, London SW1P 2PN
Telephone: 020 7828 1212  Fax: 020 7976 6187
www.audit-commission.gov.uk
£15.00 net