HDC Literature Review

Real Time Feedback of Patient Experience data
Background literature summary
6 May 2014
CONTENTS

BACKGROUND
DOMAINS OF PATIENT EXPERIENCE
METHODS OF MEASURING PATIENT EXPERIENCE
   Point of care data collection
   National data sources
   Response rates
   Qualitative data
   Presenting patient experience data
DEVELOPING QUESTIONS
   Genuine blank slate
   The NZ 20 Picker Questions
   Questions from the National Patient Satisfaction Survey
   The Friends and Family Test
   General Practitioner Tools
   RCGP approved tools
   The Session Rating Scale
   General remarks on question design
THE IMPACT OF PATIENT FEEDBACK
CONCLUSIONS
Background
CBG has been commissioned to develop and implement a system for collecting Real Time
Feedback (RTF) from users of mental health and addiction services. The project is focused
on services delivered in a community setting, and seeks feedback from consumers and their
family / whānau. The system is being developed and implemented in 7 trial “sites” in 20
different settings. Sites range in size from a small medical centre to large DHBs.
A key component of the RTF project is providing opportunities for consumers and family / whānau to have input into the development of the system. This summary provides a brief overview of the material reviewed on measuring patient experience, with the aim of bringing the Real Time Feedback project to the point where sample questions can be proposed as a basis for initial discussion at pilot sites.
The literature is vast and this is not a literature review. There are a number of useful reviews already available and it was not intended that this work be repeated. Some key source documents have been loaded onto the project SharePoint site for easy access, for example all the relevant King's Fund reports.
The Health Quality and Safety Commission (HQSC) has a workstream to develop indicators of patient experience for DHBs to use. A National Patient Satisfaction Survey that had been in place since 2000 was discontinued after concerns from DHBs that the survey was deficient in a number of respects and needed to be revised. The HQSC commissioned a report from KPMG, "Report on the Development of Patient Experience Indicators" ("HQSC Report"), to assist in developing an alternative framework for measuring patient experience. The HQSC Report draws heavily on the work of the Picker Institute, the American Consumer Assessment of Healthcare Providers and Systems (CAHPS) and the Dutch Consumer Quality Index (CQ-Index). The report proposed basing patient experience measurement on four domains, each with five indicators measuring the drivers of experience in that domain.
This work sits within a more general context of a requirement for an integrated performance
and incentive framework that recognizes shifts towards integrated care and collaborative
decision making.
The work of the HQSC contributes specifically to one component of the Triple Aim of:

• Improved quality, safety and experience of care
• Improved health and equity for all populations
• Best value for public health system resources.
To advance the work on patient experience indicators the HQSC subsequently purchased a licence to a Picker Institute bank of questions and trialled a subset of them with DHBs and other stakeholders. After receiving feedback on the questions the HQSC reduced the question set further and has proposed 20 questions that are being trialled in a small number of DHBs. It is proposed that DHBs start reporting using these questions in June this year.
It seemed sensible to start the Real Time Feedback project with the output of this
workstream as a baseline, as it is part of a proposed national programme, and to explore
whether the proposed framework can be used in the community setting.
Although community stakeholders were consulted during the development of patient
experience indicators, and the intention was to develop an indicator framework that could
be used for the whole health system, the HQSC report is oriented towards capturing DHB /
hospital inpatient experience.
There are a number of obvious ways in which an approach developed primarily for the
inpatient setting might not be well suited for collecting patient experience data in an
ambulatory care setting, for example, “on the way out” after a consultation with a GP or
counsellor, or after receiving a home based service.
• In the ambulatory setting a consumer will often be providing experience feedback about a specific item of service, rather than an episode of care. The "Coordination" of services as a domain of patient experience is probably less important than in an inpatient setting, or may not be relevant at all.

• Consumers in ambulatory settings will typically have minimal time available to complete surveys. Preliminary discussions with all 7 sites make it clear that even the reduced set of 20 questions proposed by the HQSC will probably be too many for a baseline patient experience tool.

• In the ambulatory situation the consumer may have greater concerns about access than in an inpatient setting. Indeed, "access" was specifically excluded as a possible domain of experience because it was already included in the HQSC "Outcomes Framework" (mirroring the approach taken by the NHS), but for many consumers access issues are a key part of the patient experience.
There is considerable overlap in the taxonomies of different health services research disciplines, and there is an extensive body of research on "access" that overlaps with "patient experience" research. Questions about being able to get appointments within a reasonable time, at an affordable cost or outside working hours are central to most people's experience of health care in the community. Most approaches to "access" consider the concept to be multidimensional, including cultural and organisational factors as well as financial barriers and physical features of premises. Where "access" is measured in a system performance framework is to some extent an arbitrary call.

This summary discusses source material on domains of experience, considers methods of measuring them, looks at possible question sets, and canvasses some caveats around patient feedback.
Domains of patient experience
The importance of considering the patient experience in measuring and improving the performance of health care systems has long been recognised, for example being included in position statements from the Organisation for Economic Co-operation and Development (OECD)1 and the World Health Organization (WHO)2, and in the Declaration of Alma-Ata3.
In the last 15 years there has been a concerted effort to develop frameworks for measuring
patient experience as planners and policy makers have looked for tools that measure patient
experience in a way that can result in action. There has been a shift from measuring patient
satisfaction, which is heavily confounded by patient demographics, their previous
experiences and expectations, to measuring patient experience. Patient experience is
measured by asking about what happened, as opposed to asking patients what they felt
about a service. This information can be more easily used to improve the quality of health
care as providers know more clearly what they are not doing well.
The work in developing frameworks for measuring patient experience has also been driven
by a desire to take a more systems approach to health care delivery. Shared theoretical
frameworks build sector wide understanding of systems strengths and weaknesses,
facilitating a focus on the performance of the health system as a whole, and not just on
individual services. A shared framework may also have shared indicators, and these provide
opportunities for benchmarking, provided adequate adjustment for known confounders has
been implemented.
1. Hurst J, Jee-Hughes M. Performance Measurement and Performance Management in OECD Health Systems. Vol. 47. OECD Publishing; 2001.
2. Murray CJ, Frenk J. A framework for assessing the performance of health systems. Bull World Health Organ. 2000;78(6):717–731.
3. Primary Health Care: report of the International Conference on Primary Health Care, Alma-Ata, USSR, 6–12 September 1978. Geneva: WHO; 1978.
A number of patient-centred healthcare models have been developed which set out core components of 'patient experience'. Early work from Donabedian4, Cleary5 and Ware6, while developing models of health behaviour and patient satisfaction, established analytical models for the measurement of patient experience.

The Institute of Medicine (IOM) has developed a widely known model comprising six core dimensions of patient-centred healthcare7: compassion, empathy and responsiveness to needs, values and expressed preferences; coordination and integration; information, communication and education; physical comfort; emotional support, relieving fear and anxiety; and involvement of family and friends. A wide range of quality measures can be used to examine these components of patient experience, including Patient Reported Experience Measures and Patient Reported Outcome Measures.

In the UK the Picker framework is more familiar, having been the basis for the original annual national patient surveys in acute hospitals in England, introduced in 2001. Both the IOM and Picker frameworks are informed by the same original research8,9 and are regarded as technically sound, useful and widely recognised.

The Picker Institute has published eight "Principles of Patient Centred Care"10:
4. Donabedian A. The quality of care. How can it be assessed? JAMA. 1988;260:1743–1748.
5. Cleary PD, Edgman-Levitan S. Health care quality. Incorporating consumer perspectives. JAMA. 1997;278:1608–1612.
6. Ware JE Jr, Snyder MK, Wright WR, Davies AR. Defining and measuring patient satisfaction with medical care. Eval Program Plann. 1983;6:247–263.
7. Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academy Press; 2001.
8. Jenkinson C, Coulter A, Bruster S. The Picker Patient Experience Questionnaire: development and validation using data from in-patient surveys in five countries. Int J Qual Health Care. 2002;14(5):353–358.
9. Jenkinson C, Coulter A, Reeves R, Bruster S, Richards N. Properties of the Picker Patient Experience questionnaire in a randomized controlled trial of long versus short form survey instruments. J Public Health Med. 2003;25(3):197–201.
10. http://www.pickereurope.org/our-mission-and-values.html
• Respect for patients' values, preferences and expressed needs
• Coordination and integration of care
• Information, communication and education
• Physical comfort
• Emotional support and alleviation of anxiety
• Involvement of family and friends
• Transition and continuity
• Access to care
In addition to the IOM dimensions, the Picker framework includes ‘access’ and explicitly
identifies ‘transition and continuity’.
These "Picker Principles" have subsequently been tested and validated in a diverse range of settings. They are now widely accepted as a useful framework for conceptualising and measuring patient experience, being utilised explicitly in the design of patient experience measurement systems in the NHS, the US and Australia (and now New Zealand). Research from the Picker group has also shown (counter-intuitively) that Picker questions may be embedded in a longer questionnaire with no apparent reduction in response rate or validity.11
The HQSC report represents these principles by four “domains” of patient experience:
11. Jenkinson C, Coulter A, Reeves R, Bruster S, Richards NJ. Properties of the Picker Patient Experience questionnaire in a randomized controlled trial of long versus short form survey instruments. J Public Health Med. 2003;25(3):197–201.
Proposed domains of patient experience

Domain | Description
Communication | Communicating and sharing information with patients, consumers and families / whānau.
Partnership | Encouraging and supporting participation and collaboration in decision making by patients, consumers and families / whānau.
Coordination | Coordination, integration and transition of care between clinical and ancillary and support services across different provider settings.
Physical and Emotional Needs | Treating patients, consumers and families / whānau with dignity and respect and providing the necessary physical and emotional support.
The Picker Principles are very widely accepted, but they can be operationalised in various ways. The Picker Patient Experience Questionnaire (PPE-15) is a 15-item version of the NHS Picker Adult In-Patient questionnaire, designed for use in inpatient care settings. One criticism of this implementation is its relative weakness on emotional aspects of care, which are known to be important to patients.12 A recent study proposed a framework that regards patient experience of care as having three components:

• 'functional' aspects of care (such as arranging the transfer of patients to other services, administering medication and helping patients to manage and control pain);
12. Robert G, Cornwall J, Brearley S, et al. What matters to patients? Developing the evidence base for measuring and improving the patient experience. A report for the Department of Health and NHS Institute for Innovation and Improvement. Warwick: NHS Institute for Innovation & Improvement; 2011. http://www.institute.nhs.uk/images/Patient_Experience/Final%20Project%20Report%20pdf%20doc%20january%202012.pdf (accessed 5/3/2012).
• 'transactional' aspects of care (in which the individual is cared 'for', eg, meeting the preferences of the patient as far as timings and locations of appointments are concerned); and
• 'relational' aspects of care (where the individual is cared 'about', eg, care is approached as part of an ongoing relationship with the patient13).
The study showed that one instrument originally developed in Australia, the Patient Evaluation of Emotional Care during Hospitalisation13,14, may strengthen understanding of the relational aspects of patient experience in English hospitals, compared with the PPE-15. This lends some support to the idea of collecting free-text feedback in patient experience data collection.
13. Williams A, Kristjanson L. Emotional care experienced by hospitalised patients: development and testing of a measurement instrument. J Clin Nurs. 2009;18:1069–77.
14. Williams AM, Pienaar C, Toye C, et al. Further psychometric testing of an instrument to measure emotional care in hospital. J Clin Nurs. 2011;20:3472–82.
Methods of measuring patient experience
Point of care data collection
The King's Fund has published a good review of methods of collecting data about patient experience, included in the HQSC Report on the Development of Patient Experience Indicators15. This project will use portable devices or kiosks. The strengths and limitations of the main point-of-care methods are summarised below:
Online survey (email or web-based)

Strengths:
• User-friendly design – questions can be tailored to the respondent and 'skips' avoided, leading to better item-response completeness
• Reminders are easy to send
• Data entry is automatic, allowing rapid turnaround of results

Limitations:
• Requires a list of email addresses or an invitation to go to a website
• Not suitable for people who do not have internet access, so representative coverage is usually impossible
• Questionnaire needs to be brief
• Must take account of differences in computer systems and browsers

Survey using hand-held portable devices

Strengths:
• Used for on-site data collection
• Questionnaires easily tailored to the local setting
• Automatic data entry
• Rapid turnaround of results possible

Limitations:
• Questionnaires must be brief
• Attention must be paid to infection control if patients are to handle devices
• Someone must take responsibility for the PDA devices and monitor their use
• May be difficult to calculate response rates

Survey using touch-screen kiosks

Strengths:
• Used for on-site data collection
• Can be sited in waiting rooms or clinics
• Automatic data entry
• Rapid turnaround of results possible

Limitations:
• Questionnaires must be brief
• Attention must be paid to infection control if patients are to handle devices
• Impossible to calculate response rates because the denominator is unknown
• Hard to prevent multiple responses or staff masquerading as patients

15. The document is available on the project website.
An additional limitation may be theft of devices in a busy waiting room.
In general, the longer the gap between the experience and the consumer providing feedback, the more likely the consumer is to report negative experiences. For example, in a French study of patients' perceptions of the care they received at hospital outpatient clinics16, a 10% drop in satisfaction score was observed in patients who did not complete assessments immediately after the service was delivered.
16. Gasquet I, Villeminot S, Estaquio C, Durieux P, Ravaud P, Falissard B. Construction of a questionnaire measuring outpatients' opinion of quality of hospital consultation departments. Health Qual Life Outcomes. 2004;2:43.
National data sources
It is worth mentioning that most countries collect some data about people's experiences of health services. The Australian Bureau of Statistics data can be accessed online, although they are really just item-of-service counts:
http://www.abs.gov.au/ausstats/abs@.nsf/detailspage/4839.0.55.0012009
The NHS Health and Social Care Information Centre provides detailed access to real patient
experience data:
https://indicators.ic.nhs.uk/webview/
(GP Practice Data > Patient Experience). The user is redirected to the results of the GP Patient Survey; choosing a practice displays comparative data.
© CBG Health Research Limited
13
This site also lets you explore the GP Patient Survey results, and by following another path you can access all indicators in the NHS Outcomes Framework. [Screenshot produced 3/03/2014 omitted.]
The New Zealand Health Survey collects data continuously from 400 households a week and
includes modules that ask about utilisation of health services and barriers to access,
including financial barriers and ease of getting appointments.
Response rates
The response rates to postal patient experience questionnaires vary enormously. In discussion with DHBs during the project initiation phase, responses to mailed-out surveys varied from 16% to 40%. One research paper reported a response rate of 28%. A common recommendation for obtaining higher response rates is to send two reminders. The NHS Friends and Family Test is expected to achieve at least a 15% response rate. One of the highest response rates was from a New Zealand study17. In mid-2009, Cancer Control New Zealand sent an NRC+Picker postal survey to a stratified sample of 3251 eligible adults who had received outpatient cancer care between October 2008 and March 2009. They obtained a 68% response rate, after intensive follow-up.

In an ambulatory setting where consumers can choose whether or not to complete a survey it is difficult to calculate response rates, although in smaller locations with single services they can be estimated fairly accurately if the number of consumers and the number of responses are known, as the sketch below illustrates. The calculation would be complicated by more than one survey being completed for one consumer, which might be quite legitimate as family / whānau are encouraged to give feedback.
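By way of a hedged illustration (not drawn from any of the cited studies), the estimate for a small single-service site might look like the following sketch, where the visit count and survey count are assumed to be available from the service:

```python
def estimated_response_rate(surveys_completed: int, consumers_seen: int) -> float:
    """Rough response-rate estimate for a small, single-service site.

    Treats each consumer as one potential response. Because family / whanau
    may also complete surveys, the numerator can exceed the denominator,
    so the result is capped at 1.0 and should be read as an approximation.
    """
    if consumers_seen <= 0:
        raise ValueError("consumers_seen must be positive")
    return min(surveys_completed / consumers_seen, 1.0)

# Example: 34 surveys completed in a week in which 120 consumers were seen.
print(f"{estimated_response_rate(34, 120):.0%}")  # -> 28%
```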
A fascinating study from Switzerland18 showed that, even though incomplete participation is of particular concern for surveys of patient perceptions of care (because patients who have negative opinions may be least likely to participate), increasing participation from 30% to 70% had only a modest influence on the final conclusions of the survey.
Qualitative data
There is a risk with quantitative survey approaches that consumers will not be able to give
the feedback they want to, simply because the question isn’t there. In fact there has been
increased activity in the area of collecting more in-depth data in interviews or focus groups, including developing methodologies for "experience-based co-design"19 (EBCD). In this approach providers and consumers work out together what the service should do, using a variety of research techniques. EBCD focuses on patient and staff experience and emotions rather than attitudes or opinions, and uses storytelling, including videos, to discover ways that services can be improved.

17. O'Brien I, Britton E, Sarfati D, et al. The voice of experience: results from Cancer Control New Zealand's first national cancer care survey. N Z Med J. 2010;123(1325):10–19.
18. Perneger TV, Chamot E, Bovier PA. Nonresponse bias in a survey of patient perceptions of hospital care. Med Care. 2005;43(4):374–380.
There is potentially rich data that can be used to improve services that might not be captured by simple questionnaires. Capturing data in interviews and focus groups is expensive, but it is a simple matter to provide an option for free-text input on mobile devices.
Presenting patient experience data
In this project we will present data using either SAS Visual Analytics or a data presentation tool purchased by CBG Health Research (YellowFin BI). A range of reporting options will be explored with services to develop easy-to-interpret displays, with drill-down capability for further data exploration.

To use data for benchmarking, some degree of score adjustment may be required to control for known determinants of responses. In a study of Picker Problem Scores in Bern, Switzerland20, 12.8% of the variation in hospital aggregate scores could be explained by patient factors (see the table below).

We will develop methods for adjusting for some consumer demographics as part of this project, starting with presenting data for different population groups separately when there is enough data to protect consumer identity.
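As a purely hypothetical sketch of that last point (the threshold and field names are assumptions for illustration, not project decisions), subgroup counts could be suppressed where cells are too small to protect anonymity:

```python
from collections import Counter

MIN_CELL_SIZE = 10  # assumed suppression threshold; the project would set its own

def subgroup_counts(responses, group_key):
    """Count survey responses per subgroup, suppressing small cells.

    `responses` is a list of dicts such as {"ethnicity": "...", "score": 4};
    groups with fewer than MIN_CELL_SIZE responses are reported as
    suppressed rather than disclosed.
    """
    counts = Counter(r[group_key] for r in responses)
    return {
        group: n if n >= MIN_CELL_SIZE else f"suppressed (<{MIN_CELL_SIZE})"
        for group, n in counts.items()
    }
```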
19. http://www.kingsfund.org.uk/projects/ebcd/experience-based-co-design-description
20. Holzer BM, Minder CE. A simple approach to fairer hospital benchmarking using patient experience data.
Factor | Number of levels | Type of factor | Proportion of variance explained (%)
All factors tested | (10) | – | 12.8
Self-reported health | 5 | Patient | 4.8
Age | 3 | Patient | 2.2
Mode of admission | 2 | Process | 1.0
Education | 5 | Patient | 0.7
Hospital classification | 4 | Hospital | 0.4
Service department | 3 | Process | 0.3
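To make the idea of adjustment concrete, a minimal sketch follows; it assumes a simple linear model, the pandas and statsmodels libraries, and hypothetical column names, and it is not the method used in the Bern study:

```python
import pandas as pd
import statsmodels.formula.api as smf

def casemix_adjusted_means(df: pd.DataFrame) -> pd.Series:
    """Benchmark providers on scores adjusted for known patient factors.

    `df` is assumed to hold one row per response with columns
    `score`, `self_reported_health`, `age_band` and `provider`.
    Variation explained by the patient factors is regressed out,
    and providers are then compared on the adjusted scores.
    """
    model = smf.ols("score ~ C(self_reported_health) + C(age_band)", data=df).fit()
    adjusted = df["score"] - model.fittedvalues + df["score"].mean()
    return adjusted.groupby(df["provider"]).mean().sort_values()
```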
Developing questions
Genuine blank slate
The Real Time Feedback project seeks to genuinely engage consumers, family and whānau to collect information on the domains of experience they want to give feedback on, and indeed on the specific questions they think should be included. Views will be sought on question types, presentation and number.

However, to start discussion it was considered helpful to provide a sample set of questions, with discussion guides making it clear that these were for illustration only and that services, consumers, family and whānau should not be constrained to consider just these questions.
The NZ 20 Picker Questions
As mentioned earlier, the HQSC has developed a 20-item questionnaire for trialling in selected DHBs, for potential national roll-out. The questionnaire is attached as Appendix 1. This questionnaire will be provided to pilot sites as part of their information pack, to give further ideas of what sort of questions might be used.
Questions from the National Patient Satisfaction Survey
The HDC contacted the Ministry of Health and asked which questions from the discontinued
National Patient Satisfaction Survey might be useful for this project, in terms of relevance to
the context and potentially providing some continuity of measurement, albeit in different
circumstances. The suggested questions were:
Question number | Text
Q6. | My opinions and ideas are included in my treatment plan.
Q12. | Staff provided me with useful information about my illness and the treatments available to me.
Q14. | Staff provided my family with the education or support they need to be helpful to me.
Q17. | As a result of the service I have received, I have learnt new ways of coping if I become unwell again in the future.
These are all answered on a 5-point scale from "strongly disagree" to "strongly agree". Some of the language here is institutional and may not be relevant or understandable to a consumer of community-based mental health and addiction services. The questions also have a slight acquiescence bias; for example, in Q17 it is good to learn new things, so it is harder to answer this question with a lower agreement rating.
The Friends and Family Test
In the UK the NHS has recently undertaken a wide range of research on tools for measuring patient experience21,22.

As a result of reports from the Care Quality Commission (which inspects health care facilities) and the inquiry into Mid-Staffordshire NHS Foundation Trust (where very poor patient care caused a national scandal), the NHS decided to look for a method of rapidly finding out if hospitals were failing to provide adequate levels of care. The outcome of that work was the Friends and Family Test.
The Friends and Family Test asks a single question:
21. 'What Matters to Patients?' The King's Fund, September 2011.
22. 'Overarching Questions for Patient Surveys: Development Report for the Care Quality Commission.' National Patient Survey Co-ordination Centre, Picker Institute Europe, June 2012.
“How likely are you to recommend our <ward / A&E department> to friends and family if
they needed similar care or treatment?”
Key features of the implementation of the Friends and Family Test are:

• Providers collect responses on a scale that goes from extremely unlikely to extremely likely.
• They are expected to add other questions that might help drive quality improvement as well, but these are not set.
• The question must be asked of anyone who has been admitted or attended ED. It is not administered to patients of outpatient or day case services.
• There are no fixed requirements for response rates; however, it is expected that responses will be received from at least 15% of the Trust's survey group.
• The question can be asked by postal survey, via online feedback or text messages, or at "kiosks" on site.
• Feedback must be received within 48 hours.
• The results must be made publicly available down to ward level.
In 2012/13, the Friends and Family Test was introduced in a standardised format in the
Midlands and East region across all acute providers, and is now being rolled out to the whole
NHS.
Waitemata DHB has implemented an inpatient experience questionnaire that closely follows the NHS guidelines. The questionnaire is set up on a tablet PC and asks patients to identify the ward or service, then start the survey. The survey assures patients that it is anonymous, then collects basic demographic information in broad groups. The survey comprises the Friends and Family Test and four further questions:

• Did we show care and respect?
• Did we meet your expectations?
• Did we see you promptly?
• Did we listen and explain?
© CBG Health Research Limited
20
Responses to the Friends and Family Test are on a vertical five-point scale, Extremely Likely to Extremely Unlikely, with a "Don't know" option. The other responses are on a four-point scale, Excellent to Poor, with smiley faces. There is a further free-text option that asks patients "Please tell us the main reason you gave us that score".
The Waitemata DHB Friends and Family Test does not match the proposed HQSC domains,
and includes a specific access question. It is however the closest to the likely Real Time
Feedback survey in terms of simplicity, design and ease of use.
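As an illustrative sketch only (the response labels are read off the description above and the record layout is assumed, not Waitemata DHB's actual system), responses of this shape can be tallied into a simple per-ward summary alongside the free-text comments:

```python
from collections import Counter, defaultdict

def summarise_by_ward(responses):
    """Tally Friends and Family Test answers per ward and collect comments.

    Each response is assumed to be a dict such as:
    {"ward": "Ward 5", "fft": "Extremely Likely", "comment": "..."}
    """
    tallies = defaultdict(Counter)
    comments = defaultdict(list)
    for r in responses:
        tallies[r["ward"]][r["fft"]] += 1
        if r.get("comment"):
            comments[r["ward"]].append(r["comment"])
    return tallies, comments
```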
General Practitioner Tools
Perhaps the longest tradition of academic research on patient experience comes from general practice / ambulatory care, a discourse that is relatively under-represented in the HQSC report. One large study of 3650 patients across 8 European countries reported differences in patients' rankings of what they considered to be important in primary care23.

Although there were interesting differences between patients in each country, some of which could be explained by differences in health care systems, and others that appeared to be cultural or idiosyncratic, the results were surprisingly consistent across the eight countries. Across all countries the most important thing was that doctors had enough time to listen, talk and explain.
The following table shows the top 10 items. The complete table of 38 items is available at:
http://fampra.oxfordjournals.org/content/16/1/4/T4.expansion.html
23. Grol R, Wensing M, Mainz J, Ferreira P, Hearnshaw H, Hjortdahl P, Oleson F, Ribacke M, Spenser T, Szécsényi J. Patients' priorities with respect to general practice care: an international comparison. Family Practice. 1999;16:4–11.
What would make for a good GP? (figures are percentages by country)

# | Item | Sweden | Portugal | Norway | Israel | Netherlands | Germany | Denmark | UK
1 | During the consultation a GP should have enough time to listen, talk and explain to me. | 91 | 88 | 85 | 91 | 93 | 89 | 90 | 89
2 | A GP should be able to provide quick service in case of emergencies. | 88 | 89 | 89 | 94 | 88 | 87 | 91 | 80
3 | A GP should guarantee the confidentiality of information about all his patients. | 84 | 82 | 88 | 85 | 91 | 77 | 88 | 85
4 | A GP should tell me all I want to know about my illness. | 85 | 84 | 89 | 82 | 76 | 69 | 84 | 85
5 | A GP should make me feel free to tell him or her my problems. | 87 | 82 | 68 | 75 | 89 | 82 | 86 | 81
6 | It should be possible to make an appointment with a GP at short notice. | 74 | 74 | 69 | 84 | 86 | 77 | 81 | 86
7 | A GP should go to courses regularly to learn about recent medical developments. | 80 | 77 | 80 | 79 | 80 | 84 | 77 | 70
8 | A GP should not only cure diseases, but also offer services in order to prevent diseases. | 73 | 76 | 79 | 64 | 82 | 86 | 79 | 79
9 | A GP should critically evaluate the usefulness of medicines and advice. | 79 | 79 | 74 | 79 | 74 | 75 | 66 | 74
10 | A GP should explain the purpose of tests and treatment in detail. | 73 | 79 | 61 | 68 | 65 | 79 | 79 | 72

Items are ordered by mean rank across the eight countries.
RCGP approved tools
In 2012 the Royal College of General Practitioners approved six tools as fit for purpose for measuring patient experience for the purposes of GP "revalidation" (reaccreditation):

• General Medical Council Patient Questionnaire
• CFEP Improving Practice Questionnaire (IPQ)
• EDGECUMBE 360° Version 2
• CFEP Doctors' Interpersonal Skills Questionnaire (DISQ)
• Consultation Satisfaction Questionnaire (CSQ)
• CARE Measure questionnaire

In all cases the tools consist of a small number of questions, similar to those of the GMC Patient Questionnaire. The CFEP tools can be purchased and completed on paper; the CSQ questionnaire (from the University of Leicester) can be completed online by patients, using a code provided to them by the GP.
The RCGP 'Patient Satisfaction Questionnaire' (PSQ) is another widely used tool. All GP registrars get feedback from at least 40 patients as part of their Membership exam. While nominally a patient satisfaction tool it is more a patient experience tool, asking how well the doctor delivered specific aspects of the consultation. Feedback is collected on paper forms and sent to the local branch of the College, which then analyses them and provides the registrar with a report. Registrars are expected to then discuss the report with a mentor. This is used as part of GPs' work-based assessment in their Membership exam.
Patients are asked “How good was your doctor at…”
Making you feel at ease… (being friendly and warm towards you, treating you
with respect; not cold or abrupt)
Letting you tell “your” story… (giving you time to fully describe your illness in
your own words; not interrupting or diverting you)
Really listening… (paying close attention to what you were saying; not looking
at the notes or computer as you were talking)
Being interested in you as a whole person… (asking/knowing relevant details
about your life, your situation; not treating you as “just a number”)
Fully understanding your concerns… (communicating that he/she had
accurately understood your concerns; not overlooking or dismissing anything)
Showing care and compassion… (seeming genuinely concerned, connecting
with you on a human level; not being indifferent or “detached”)
Being positive… (having a positive approach and a positive attitude; being
honest but not negative about your problems)
Explaining things clearly… (fully answering your questions, explaining clearly,
giving you adequate information; not being vague)
Helping you to take control… (exploring with you what you can do to improve
your health yourself; encouraging rather than "lecturing" you)
Making a plan of action with you… (discussing the options, involving you in
decisions as much as you want to be involved; not ignoring your views)
How would you rate your consultation with this doctor today?
These questions are all answered on a 7-point scale:
Poor to Fair / Fair / Fair to Good / Good / Very Good / Excellent / Outstanding
The Session Rating Scale
In early research for this project the Session Rating Scale24, developed in the early 90s by Lynn Johnson, was considered as a possible prototype for collecting patient experience data. Its psychometric properties have been studied extensively25. In brief, the scale was developed by psychotherapists to measure the concept of "alliance". Alliance is a key determinant of therapy outcome: regardless of any other interventions, the strongest predictor of therapeutic success, however defined, is the strength of the patient / therapist alliance. In fact 54% of the variation in therapeutic outcomes can be explained by "alliance"26.

As the authors say:

"Putting this into perspective, the amount of change attributable to the alliance is about seven times that of a specific model or technique."

The research team had noted that, even though the importance of "alliance" was indisputable, with over 1000 pieces of published evidence, therapists did not measure it, even though a number of tools were freely available. The key issue was the length of the tools. A 19-item measure with good reliability and validity was available, but even in at-risk situations, when measuring "alliance" is most valuable, it was used by fewer than 40% of therapists.

To address this need a 4-item tool was developed, illustrated below. The emphasis on patient-centredness is clear.
Having decided to take an approach to questionnaire development that leverages other developments in the NZ Health Survey, it is unlikely we will implement the SRS. The key message from this research is the importance of having a small number of questions: in this case the clinicians themselves did not want to use tools that took consumers too long to fill out.
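For illustration only, a four-item alliance measure of this kind could be captured as four 0–10 slider ratings; in the sketch below the item labels paraphrase the published SRS dimensions and the alert threshold is an assumption, not a property of the scale as licensed:

```python
SRS_ITEMS = [  # paraphrased labels; the licensed wording differs
    "Relationship: did you feel heard, understood and respected?",
    "Goals and topics: did you work on what you wanted to work on?",
    "Approach or method: was the approach a good fit for you?",
    "Overall: was today's session right for you?",
]

ALERT_THRESHOLD = 36  # assumed cut-off prompting discussion with the consumer

def session_score(ratings):
    """Sum four 0-10 ratings into a 0-40 session score."""
    if len(ratings) != len(SRS_ITEMS):
        raise ValueError("expected one rating per item")
    if any(not 0 <= r <= 10 for r in ratings):
        raise ValueError("ratings must be between 0 and 10")
    return sum(ratings)

# Example: a total of 35 falls below the assumed threshold and would be flagged.
print(session_score([9, 8, 9, 9]) < ALERT_THRESHOLD)  # -> True
```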
24. Johnson LD, Miller SD, Duncan BL. The Session Rating Scale 3.0. Chicago: Author; 2000.
25. Duncan B, Miller S, Sparks J, et al. The Session Rating Scale: preliminary psychometric properties of a "working" alliance measure. Journal of Brief Therapy. 2003;3(1).
26. Wampold BE. The great psychotherapy debate: models, methods, and findings. Hillsdale, NJ: Lawrence Erlbaum; 2001.
[The Session Rating Scale form was reproduced here.]
General remarks on question design
There is also a more general literature on principles of survey methodology, and in particular on self-administered question design. One set of recommendations, although possibly self-evident, is worth stating27:

• Use prominent visual guides
• Ask one question at a time
• Minimise the cognitive effort of comprehension (very simple language)
• Minimise the cognitive effort of retrieval
• Use visual elements consistently (brightness / colour / shape / position)
These principles will be used as much as possible in our self-administered questionnaire. The fact that clients will often not speak or read English means that the questionnaire should be translated into different languages.
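To make these principles concrete, one hypothetical way to structure a question (this is a sketch, not the project's actual design) is a small record carrying per-language text and one response scale, so the device can show a single simply worded question per screen and fall back to English where a translation is missing:

```python
from dataclasses import dataclass

@dataclass
class Question:
    """One survey question, shown on its own screen."""
    qid: str
    text: dict        # language code -> question wording
    scale: list       # response option labels shown with the question
    free_text: bool = False  # whether to offer an optional comment box

    def render(self, lang: str) -> str:
        # Fall back to English if no translation exists for `lang`.
        return self.text.get(lang, self.text["en"])

listened = Question(
    qid="listened",
    text={"en": "Did staff listen to you?",
          "mi": "(te reo Maori translation here)"},  # placeholder translation
    scale=["Yes, always", "Yes, sometimes", "No"],
    free_text=True,
)
print(listened.render("mi"))
```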
27. Jenkins C, Dillman D. Towards a theory of self-administered questionnaire design. In: Survey Measurement and Process Quality. New York: Wiley; 1997. pp. 165–196.
The impact of patient feedback
It has been assumed that collecting and providing patient feedback information will drive improvements in health care, and there are many reports of specific institutions achieving improvements in certain measures after the introduction of patient experience data collection. However, at a systems level the evidence is sparse. The NHS Inpatient Survey for acute NHS Trusts has been sampling 135,000 patients per year in all hospitals since 2002. In a paper published last year28 the authors note:
"Over the last 10 years, the surveys have detected improvements in the national results, but the aspects of care that have improved are those that can be linked to national targets or high-profile media campaigns (such as waiting times for inpatient admissions and ward cleanliness). The results for most questions have stayed the same or, in some cases, have deteriorated. For example, in 2011, only 53% of patients said their call bells were usually answered within two minutes (10% worse than 2004) and 21% said nurses "talked in front of them as if they were not there" (3% worse than 2002). There is little evidence that the survey programme itself has driven any improvements in patients' experiences."
The problem may be, and this is supported by qualitative research, that results are not presented to the actual providers of services, i.e. the ward staff. While managers get reports, ward staff may interpret results as not applying to them. There is also research showing that clinician engagement with patient feedback data may be enhanced by including patients' comments alongside numerical data29,30. This may increase the extent to which results provoke a response from providers.
28. Reeves R, West E, Barron D. Facilitated patient experience feedback can improve nursing care: a pilot study for a phase III cluster randomised controlled trial. BMC Health Serv Res. 2013;13:259.
29. Boyer L, Francois P, Doutre E, Weil G, Labarere J. Perception and use of the results of patient satisfaction surveys by care providers in a French teaching hospital. Int J Qual Health Care. 2006.
The authors also express some concerns about the expanded collection of patient
experience data, and in particular the new Friends and Family survey:
“Arguably, the surveys’ lack of impact could also be attributed to an over-emphasis on data
collection per se, rather than using results to improve the quality of care. The recent
promotion of “real-time” feedback will increase the volume of survey data collected by NHS
trusts while reducing the rigour of survey methods. Uniform data collection methods are not
required to conduct the “Friends and Family” survey, which covers all NHS inpatients and
emergency department attendees from April 2013 onwards, and there are no formal
mechanisms in place to ensure that the survey data are used for quality improvement.”
The Mid-Staffordshire NHS Foundation Trust scandal occurred after 10 years of collecting patient experience data. The system that was in place was obviously seriously flawed, and this is what precipitated the development of the Friends and Family Test.

These experiences strongly support the importance of feeding information back to the people who are delivering services, and further emphasise the value of providing narrative data back to providers, as this personalises the feedback and makes it less likely that results can be dismissed as mere statistics.
30. Reeves R, Seccombe I. Do patient surveys work? The influence of a national survey programme on local quality improvement initiatives. Qual Saf Health Care. 2008.
Conclusions
After reviewing this background material some high-level recommendations for the Real Time Feedback pilot can be made:

• The number of questions should be small – probably in the 4–7 region.
• Users of community services have concerns about access / timeliness (a component of most primary care patient experience tools); this could be a possible question, and indeed one has been included in the Waitemata DHB Friends and Family Test.
• Addressing the four domains of experience in the HQSC report might not be the best use of a small number of questions.
• There is real value in collecting free-text feedback: it gives voice to consumers in areas that might not be covered by the questions and increases the chance that providers will act on feedback.
• The process by which feedback data is used will have a big influence on the outcomes of the project – feedback alone is not enough.
• Adjusting for "case-mix" is important if data is used for benchmarking.

As always, clean design and extreme simplicity will be prerequisites for the survey as it appears on devices.
© CBG Health Research Limited