Performance indicators: good, bad, and ugly

The report of the Royal Statistical Society working party on performance monitoring in the public services, chaired by Professor Sheila Bird

“Performance monitoring done well is broadly productive for those concerned. Done badly, it can be very costly and not merely ineffective but harmful and indeed destructive - of morale, reputations and the public services.”
Methodological rigour in selecting indicators
- Sample surveys should be designed, conducted and analysed in accordance with statistical theory and best practice
- Administrative data should be fully auditable
- Concepts, questions, etc. should be comparable and harmonised where possible, conforming to national or international standards as appropriate
- Indicators should be precise and accurate enough to show reliably when change has occurred (see the sketch after this list)
- Definitions should be precise
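
As a rough illustration of the point about indicators being precise enough to show change, the sketch below uses a standard two-sample normal approximation to ask how large a sample survey would need to be for a proportion-based indicator to detect a given change. The function name and the 60%-to-65% scenario are invented for illustration, not taken from the report.

```python
import math
from statistics import NormalDist

def sample_size_for_change(p_baseline, change, alpha=0.05, power=0.80):
    """Approximate sample size per survey round needed to detect a given
    change in a proportion-based indicator (two-sample normal approximation)."""
    p1, p2 = p_baseline, p_baseline + change
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance level
    z_beta = NormalDist().inv_cdf(power)            # desired power
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / change ** 2
    return math.ceil(n)

# e.g. an indicator expected to move from 60% to 65%:
print(sample_size_for_change(0.60, 0.05))   # roughly 1,500 respondents per round
```

An indicator measured on a much smaller sample than this could not show such a change reliably, which is the “precise and accurate enough” requirement above.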

Definitions of both indicators and targets should be:
- Precise but practicable
  - useful definitions should be given for all the key concepts in the indicator or target
- Consistent over time
  - any changes to definitions or methods should be fully documented
- Unambiguous
  - there should be no possibility of disagreement about whether progress means the indicator going up or going down
Practitioners involved should have input
- For targets to be ambitious but achievable, a good understanding is needed both of the practicalities of delivery on the ground and of the data
- To understand the practicalities of delivery, practitioners should be consulted
- Targets that are motivational but unrealistic may demoralise
Monitor for perverse outcomes
- Badly thought-through targets can lead practitioners to play the system rather than improve performance
- An example from the report:
  - An indicator for prisons is the number of “serious” assaults on prisoners
  - “Serious” means a proven prisoner-on-prisoner assault
  - The indicator would improve if prisons reduced their investigations into assaults
Do not ignore uncertainty or variability
- Insistence on single numbers as answers to complex questions should be resisted
- Natural variability, outliers, recording errors and statistical error (i.e. confidence intervals around sample estimates) all need to be considered
- All need to be clearly presented (one way of doing so is sketched below)
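
One way of presenting that uncertainty is to report a confidence interval alongside the point estimate. The sketch below is illustrative only; the survey figures are invented, and the normal-approximation interval is just one of several standard choices.

```python
import math
from statistics import NormalDist

def proportion_with_ci(successes, sample_size, confidence=0.95):
    """Point estimate and normal-approximation confidence interval
    for a proportion estimated from a sample survey."""
    p = successes / sample_size
    z = NormalDist().inv_cdf((1 + confidence) / 2)
    half_width = z * math.sqrt(p * (1 - p) / sample_size)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# Hypothetical indicator: 720 of 1,000 surveyed users rate the service "good"
estimate, low, high = proportion_with_ci(720, 1000)
print(f"Indicator: {estimate:.1%} (95% CI {low:.1%} to {high:.1%})")
# -> Indicator: 72.0% (95% CI 69.2% to 74.8%)
```
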
Do not set 100% targets
- 100% targets can lead to perverse outcomes, demoralise when failure inevitably occurs, and lead to disproportionate resources being used
- An example from the report:
  - “No patient shall wait in A&E for more than 4 hours”
  - This becomes irrelevant as soon as one patient does wait more than 4 hours
  - A&E staff may have very sound reasons for making a small number of people wait longer
Do not ignore the distribution
- A performance indicator is a single number
- Single-number summaries of data can be misleading
- An example from the report:
  - “Number of patients waiting more than 4 hours”
  - The whole distribution needs viewing to understand the indicator, e.g. has progress been achieved by getting most people seen in 3 hours 59 minutes but some not for 10 hours? (see the sketch below)
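
The sketch below makes the same point with invented waiting times: the single-number indicator looks reasonable, while a quantile summary of the same data shows a cluster just under the 4-hour mark and a small tail of very long waits.

```python
from statistics import quantiles

# Invented A&E waiting times (minutes) for one reporting period
waits = [35, 50, 70, 90, 110, 130, 150, 170, 190, 210,
         225, 230, 235, 238, 239, 239, 239, 300, 480, 600]

# The headline indicator: one number
over_4h = sum(w > 240 for w in waits)
print(f"Patients waiting over 4 hours: {over_4h} of {len(waits)}")

# The distribution behind it: deciles reveal the pile-up just under
# 4 hours and the long tail that the single number hides
print("Deciles (minutes):", [round(q) for q in quantiles(waits, n=10)])
```
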
Do not mistake statistical significance for practical importance
- There cannot be a difference of practical importance if the difference is not statistically significant (because the difference might not be genuine – it could just be chance)
- BUT a difference can be statistically significant without being practically important (because statistical significance can be achieved simply by having a huge sample size), as the sketch below illustrates
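
A minimal sketch of that second point, using invented figures: with two samples of a million records each, a change of 0.4 percentage points comes out as highly statistically significant even though few would call it practically important.

```python
import math
from statistics import NormalDist

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sample z-test for a difference in proportions
    (normal approximation with a pooled estimate)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p1 - p2, p_value

# Invented example: 70.0% vs 70.4% "seen within target" in two years,
# each measured on a million records
diff, p = two_proportion_z_test(700_000, 1_000_000, 704_000, 1_000_000)
print(f"Difference: {diff:.1%}, p-value: {p:.2g}")
# The p-value is far below 0.05 purely because the samples are huge;
# the 0.4 point difference may matter very little in practice.
```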

Consider not setting a target until data are well understood
- The statistical properties of an indicator will be much better understood after one or two rounds of analysis
- It may therefore be sensible to wait before setting a target
Document everything: others should be able to replicate procedures
- All assumptions and methods should be fully documented so that others can fully understand and replicate the results
- A ‘PM Protocol’ should include (a rough template is sketched after this list):
  - Objectives
  - Definitions
  - Survey methods / information about data
  - Information about context
  - Risks of perverse outcomes
  - How the data will be analysed
  - Components of variation
  - Ethical, legal and confidentiality issues
  - How, when and where data will be published
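
Purely as an illustration of how such a protocol could be kept as a structured, versionable document, the sketch below turns the checklist into a template. The field names and the use of a Python dataclass are assumptions made for the example; the report does not prescribe any particular format.

```python
from dataclasses import dataclass, field

@dataclass
class PMProtocol:
    """Template mirroring the checklist above; every field should be
    completed and published alongside the indicator."""
    objectives: str
    definitions: dict                 # each key concept -> its precise definition
    data_sources: str                 # survey methods / information about the data
    context: str                      # information about context
    perverse_outcome_risks: list
    analysis_plan: str                # how the data will be analysed
    components_of_variation: str
    ethics_and_confidentiality: str   # ethical, legal and confidentiality issues
    publication_plan: str             # how, when and where data will be published
    change_log: list = field(default_factory=list)  # any changes to definitions or methods
```
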
The report is available on the RSS website:
http://www.rss.org.uk/PDF/Performance%20monitoring%20231003.pdf