MH PbR Quality and Outcomes Report


Mental Health Payment by Results:

Quality and Outcomes

Framework Report

Report for Product Review Group Quality & Outcomes Sub Group.

February 2013


Executive Summary

Summary

Good progress has been made during 2012/13 in developing a Quality and Outcomes Framework for use in Mental Health Payment by Results (PbR). This report recommends the use of a suite of Quality Indicators and Outcome Measures in 2013/14.

To summarise key findings:

A set of quality indicators has been analysed and 7 are recommended for use.

The Clinician Rated Outcome Measure (CROM) based on HoNOS/MHCT is now recommended for use.

Good progress has been made on the development of a Patient Rated Outcome Measure (PROM), with testing of the Warwick Edinburgh Mental Well Being Scale (WEMWBS) receiving positive feedback from Service Users.

Further progress on the development of Patient Rated Experience Measures (PREMs) and the recommendation to collect the “friends and family” question.

Completion of the IMHSEC website, which provides guidance on the content of the care package for each of the clusters.

Throughout the development of the framework there has been a strong focus on stakeholder engagement, including Service Users, which will continue into 2013/14.

National challenges remain in moving to a system based on payment for outcomes and recommendations are made to strengthen the work to date to ensure a fit for purpose framework in which payments can be used to incentivise outcomes in mental health.

Through 2013/14, further work will be undertaken to establish more detail of how the proposed indicators and outcome measures could be used as part of the payment mechanism.

The following summarises the key findings, recommendations and next steps for each part of the framework.

Quality Indicators

For all clusters, data for all of the recommended indicators is already collected through the Mental Health Minimum Data Set (MHMDS).

Providers and commissioners could confirm how they will review the indicator data collected on a cluster basis throughout 2013/14, using the recommended methodology provided within this report, and consider the value that these provide on a local basis.


Indicators analysed to date and recommended for use in 2013/14 are:

1. The proportion of users on CPA with a crisis plan in place.

2. The accommodation status of all users (as measured by an indicator of settled status and an indicator of accommodation problems).

3. The intensity of care (bed days as a proportion of care days).

4. The completeness of ethnicity recording.

5. The proportion of users in each cluster who are on CPA.

6. The proportion of users on CPA who have had a review within the last 12 months.

7. The proportion of users who have a valid ICD10 diagnosis recorded.

It is not expected that the indicators will have a direct financial impact during 2013/14.

Use of CROMs

A four factor model of the Health of the Nation Outcome Scales (HoNOS) has been developed and is due to be tested on national data from MHMDS submissions during 2013. It is expected that the statistical significance of average changes observed in the HoNOS total and four factor scores could be used to evaluate outcomes in each PbR cluster. The clinical significance of changes observed in the HoNOS total scores, which involves calculating the percentage of service users that meet the criteria for reliable improvement or deterioration, could also be used to evaluate outcomes in each PbR cluster. The results could be reported by cluster for each organisation/service provider.

At this stage providers and commissioners will need to assure themselves that the Mental Health Clustering Tool (MHCT) data (from which HoNOS can be derived) submitted for each cluster is accurate, complete and of high quality. This will require providers to ensure all MHCT items are recorded accurately and completely from April 2013 onward at the required points in the service user journey: at initial referral assessment, routine review, significant change in presenting needs, and at discharge.

From April 2013 onwards every MHCT item score will form part of the baseline for the future outcome measure, hence accurate rating is essential.

Due to variability in the quality of scoring, data recorded before this point will be disregarded for outcome measurement.

A user guide that provides further instructions on how this can be calculated locally will be made available on the Care Pathways and Packages Project (CPPP) website and the DH Quickr site in March 2013.


Use of PROMs

For all clusters a PROM should be used. It has not been possible to identify one universal PROM that adequately reflects the priorities for all of the clusters.

Testing of WEMWBS has recently taken place. As a result of this, further testing of the shortened version of WEMWBS will take place in 2013.

It is recommended that where no PROM is currently being used within an organisation, consideration should be given to using SWEMWBS (the shorter 7 item version) as the PROM of choice. Additional or different PROMs may also be used.

Commissioners and providers should ensure a PROM is being used for all of the clusters and that quarterly review of the data relating to this is undertaken.

To aid comparison and analysis, the PROM could be collected in line with the MHCT, i.e. on completion of initial assessment, at routine review, significant change in presenting needs, and at discharge.

Patient Experience

As with PROMs, there is no universal or agreed way to assess and report patient experience. Consideration is currently being given to the use of the Care Quality Commission (CQC) service user survey as part of the PbR approach.

Commissioners and providers should agree local methods of assessing and reporting patient experience on an organisational and cluster basis, and review these on a quarterly basis.

Commissioners and providers should agree local activity to assess patient experience. Consideration should be given to using the following question.

How likely are you to recommend our services to friends and family if they needed similar care or treatment?

The scale below should be used to answer the question:

1 Extremely likely

2 Likely

3 Neither likely nor unlikely

4 Unlikely

5 Extremely unlikely

6 Don’t know

As with the PROM, this should be collected in line with the MHCT, i.e. on completion of initial assessment, at routine review, significant change in presenting needs, and at discharge.

Commissioners and providers should ensure that a quarterly review of the data relating to this is undertaken.


Further national work will take place to analyse results. More information in relation to this activity will be made available on the CPPP website and the DH Quickr site.

During 2013 work will be taken forward to expand the approach to include patient experience measures in the currency model.

Developing the Framework further in 2013/14

In 2013/14, the focus will be on refining the framework further, testing the measures in practice and developing further recommendations on how the indicators and outcome measures can be used to incentivise high quality care. Key objectives include:

Testing the CROM on National Data (MHMDS) and making progress on integrating PROM and possibly PREM data on MHMDS.

Further analysis of indicators and measures, including assessment of quality indicator data to establish what good performance is and how this can be measured.

Further testing of PROMs and PREMs to establish suitability as part of the framework.

Ongoing stakeholder engagement to ensure that the right people, including service users, are engaged as the framework moves into implementation.

Making links to the payment mechanism.


Table of Contents

Executive Summary
1.0 Introduction
2.0 Quality Indicators
3.0 Clinician Rated Outcome Measurement
4.0 Patient Rated Outcome Measurement
5.0 Patient Rated Experience Measurement
6.0 IMHSEC Website
7.0 Summary


1.0 Introduction

The purpose of this report is to make recommendations on the use of quality indicators, outcome measures and patient experience feedback as part of the Mental Health Payment by Results (PbR) guidance for 2013/14.

The report sets out recommendations that are expected to form part of the final 2013/14 Mental Health PbR guidance package, explaining the steps that were taken to reach the recommendations and setting out the further work that is required.

1.1 Background

Wider Government Focus:

The Coalition Government has made it clear that judging care by outcomes is one of its top priorities for the NHS. The White Paper “Equity and Excellence: Liberating the NHS” sets this out and is supported by the Outcomes Framework, published in November 2010, and the mental health outcomes strategy “No Health without Mental Health”, published in February 2011.

The report “No health without mental health: developing an outcomes approach”, published by the NHS Confederation in December 2011, sets out the wider context of developments on mental health outcome measures. It gives an in-depth evaluation of the place of mental health within the NHS, Social Care and Public Health Outcomes Frameworks, as well as looking at their relation to “No health without mental health”.

Measuring and incentivising quality and outcomes in mental health care remains a top priority. The NHS Mandate for 2013/14 to 2014/15 (published November 2012) emphasises parity of esteem – valuing mental and physical health equally. It includes, amongst numerous references to mental health, emphasis on managing ongoing mental health conditions to achieve better quality of life, better access and reduced waiting times for mental health services, and avoidance of suicide and self harm through effective crisis response.

The white paper “Equity and Excellence” sets out the Government's key priority of improving patient experience. The 2012/13 Operating Framework made clear the priority for the NHS to put the patient centre-stage and to have a focus on improving patient experience:

“NHS organisations must actively seek out, respond positively and improve services in line with patient feedback. This includes acting on complaints, patient comments, local and national surveys and results from “real time” data techniques.”

The Department of Health pledged their ongoing commitment to recognising the importance of patient experience (DH 2012) by introducing the “friends and family test” and there will be a clear focus within the emerging NHS structures and organisations on prioritising the patient experience when commissioning care.

Clinical Commissioning Groups will be expected to commission care from organisations that improve the quality of patient experience through better insight provided by individual patient feedback (DH 2012).

PbR Policy Development


Use of the mental health clusters has been mandated from April 2011. The clusters are the currencies for most mental health services for working age adults and older people.

The development of the mental health currency groups is underpinned by the Mental Health Clustering Tool (MHCT), a tool designed by clinicians for clinicians to help assess the potential care needs of a service user. It supports professional judgement in determining a care cluster, and is based on the outcomes-focused Health of the Nation Outcome Scales (HoNOS) system.

Providers and commissioners should understand and agree the care packages which will be delivered to service users in each currency cluster and also the quality indicators and outcome measures that follow these interventions.

The introduction of mental health PbR is a major change for both providers and commissioners. For the first time clinicians will have a direct impact on the funding that their organisation receives through their work to deliver high quality care and to achieve better outcomes.

Therefore, a key component of mental health PbR is being able to measure and understand the quality and quantity of the services that are being delivered and whether service users achieve positive outcomes.

Against this backdrop of the outcomes policy environment, a decision was taken to coordinate Quality & Outcomes work at a national level, taking account of regional and specialist input and building momentum towards delivering initial recommendations by the end of 2011/12. The DH-led National Quality and Outcomes Product Review Group (PRG) sub group was formed with the primary objective to identify indicators and outcome measures specifically linked to PbR currency groups, and to recommend how these could be used as an integral part of the currencies.

In November 2011, the DH published an initial report setting out a range of work streams to support this area. Work has continued through 2012 and the Quality and Outcomes PRG sub group is now recommending for use a range of quality indicators and outcome measures for further testing and development during 2013-14. These are an integral part of the currency model to enable a better understanding of what a service is achieving, and eventually to enable some element of payment to be linked to quality.

1.2 Objectives

The purpose of the Quality and Outcomes work is to identify and recommend a range of quality indicators and outcome measures which will inform and support mental health PbR for use in 2013/14 and beyond.

The PbR Quality and Outcomes Report 2011 set out the following specific objectives:

1. To test and recommend the use of quality indicators linked to the 21 Clusters that form the basis of the currency model, using metrics that are currently collected consistently on a national basis as part of the Mental Health Minimum Data Set (MHMDS).

2. Establish a web-based tool that will provide guidance on the content of care packages for each of the 21 Clusters, linking NICE guidance, quality standards, evidence and best practice.

3. Recommend further work that can make better use of the existing data collected in the MHCT.

4. Recommend the use of Outcome Measures that will be linked to the 21 Clusters; the range of measures will include Clinician Rated Outcome Measures (CROMs), Patient Rated Outcome Measures (PROMs) and Patient Experience Outcome Measures (PREMs).

The following objectives were then agreed by the National Quality and Indicators PRG Sub Group for 2012/13:

Work Stream | Objective
Quality Indicators | Test and recommend the use of quality indicators linked to the 21 clusters that form the basis of the currency model, using metrics that are currently collected consistently on a national basis as part of MHMDS.
Clinician Rated Outcome Measure | To test and establish the use of MHCT/HoNOS ratings as a CROM as an indicator of clinical change on a cluster basis.
Patient Rated Outcome Measure | To test and establish the use of a PROM as an indicator of patient outcome on a cluster basis or super class level.
Patient Rated Experience Measures | To test and establish the use of PREMs that can be utilised at either a cluster or super class level.
IMHSEC Website | Establishment of a web based tool to provide guidance on the content of care for each of the 21 clusters.

The ultimate aspiration of the Quality & Outcomes PRG subgroup is to focus on outcomes that demonstrate a person has progressed to a position beyond their mental illness by being empowered to gain control over their life, using a recovery-focused, personalised approach.

An objective of the work is also to implement the MH PbR Q&O Framework without creating unreasonable burdens on clinicians and service users; the focus of activity has therefore been to deliver the smallest number of measures offering the greatest coverage, to reduce time demands.

1.3 Scope

The scope of the work covered in this report is limited to the delivery of an outcomes framework for mental health PbR only, though the framework will take into account other Health Outcomes frameworks that have been developed.


A challenge is to ensure a balance of quality indicators, CROMs, PROMs and PREMs whilst considering important and emerging initiatives relating to wellbeing and recovery.

These measures should complement each other and are not intended to replicate information from different perspectives.

The scope was agreed to identify a minimum of 1 Quality Indicator, 1 CROM, 1 PROM and 1 PREM that would be meaningful and beneficial in supporting mental health PbR and could be applied across each cluster.

1.4 Approach

Quality & Outcomes work has been co-ordinated at a national level, taking account of regional and specialist input. The Quality and Outcomes PRG sub group was formed with the primary objective to identify indicators and outcome measures specifically linked to PbR currency groups, and to recommend how these could be used as an integral part of the currencies. The group is part of the DH PbR governance structure and reports directly to the PbR Product Review Group (PRG) and PbR Project Board.

The approach taken was to appoint leads to each area of work with specific expertise in that subject area, ensuring that clinical leadership was in place but also supported by access to expertise in research and development, informatics and project management. Each work-stream lead has been tasked with engaging the national Quality and Outcomes Sub Group and ensuring that their proposals deliver on the key objectives.

A combination of approaches has been used to ensure appropriate and timely engagement with stakeholders, including service users. This has included a working group consisting of people with a wide range of experience and expertise in mental health plus liaison with other relevant mental health networks including IMROC and Improving Access to Psychological Therapies for the Severely Mentally Ill networks.

Additionally, several workshops have taken place involving service users, voluntary sector representatives, members of the Care Pathways and Packages Consortium, commissioners and providers.

Service user specific workshops were held in October 2012. The aims of the workshops were:

To engage with service users to share the proposed Quality and Outcomes Framework.

To gain service users' thoughts regarding the proposed quality indicators.

To share and receive feedback on proposed generic patient rated outcome measures.

To share and receive feedback on proposed optional patient rated outcome measures, specific to diagnosis and care pathways.


To share the Care Quality Commission (CQC) survey questions and establish service user opinions on what further actions Trusts take regarding monitoring and sharing patient rated experiences.

Current membership of the national Quality and Outcomes PRG sub group and Quality Indicators working group can be found in Appendix 1.

The diagram below summarises the Q&O MH PbR Framework model in development.


2.0 Quality Indicators

2.1 Scope

The purpose of the Quality Indicators work-stream is to recommend a suite of quality indicators that can be used within the PbR model as a basis for payment according to quality. Indicators will be linked to clusters and available by mental health service provider. They will be monitored through the MHMDS which collects national and consistent data through the NHS Information Centre for Health and Social Care.

Activity on the Quality Indicators focused only on data which is already collected on a national basis. The data therefore can be collected now and also used throughout next year (2013/14).

The 2012/13 plan for activity had three distinct phases:

1. Identification of a wide range of indicators and, through initial review and scrutiny, recommending a number for detailed data collection and analysis.
2. Agreeing definitions for the selected indicators and undertaking analysis.
3. Interpreting results of the data analysis, drawing conclusions and making recommendations.

The aim was to establish a limited number of worthwhile, thoroughly assessed quality indicators which can be implemented in 2013/14 and form a coherent part of the MH PbR Quality and Outcomes Framework.

2.2 Progress and findings

During 2012/13, 9 indicators have been analysed and from these 7 can be recommended for use in 2013/14.

The Quality Indicators Working Group commenced activity by using the Quality and Outcomes report from 2011 to identify a long list of potential quality indicators.

Two stakeholder events took place in June 2012, involving a range of commissioners and providers, with the aim of identifying the most meaningful indicators from the list.

The working group then considered the results in terms of the importance of indicators and the availability of data to analyse from MHMDS, and selected a shorter list of indicators for further, more detailed, analysis and evaluation. The complexity of the analysis and availability of high quality data in MHMDS was considered for each indicator, and indicators were prioritised for analysis. In October 2012, service users were asked to agree the most important indicators from the shortened list.

The Information Centre undertook the analysis of indicators, with data analysed covering five quarters from April 2011 until July 2012; approximately 750,000 user records were included. Data for 68 providers on the MHMDS were included and provider differences assessed. By end December 2012, analysis was available for 9 indicators.

The table below shows the long list of indicators considered, together with their current status and comments on each:

Indicator (October 2011 report) | Current status | Comments
Crisis plan in place | Recommended | Recommended for those on CPA.
Accommodation status (2 sub indicators) | Recommended | Proportion in settled accommodation; proportion with little problem with accommodation (HoNOS item 11).
Intensity of care – average number of bed days per annum | Recommended |
Equity of access – ethnicity | Recommended |
Proportion of service users on CPA | Recommended |
Percentage of service users on CPA reviewed annually | Recommended |
Recording of diagnostic code | Recommended |
Percentage of service users followed up in 7 days following discharge | Analysed, not yet recommended | Issues with data collection; on hold until further analysis undertaken.
Employment status | Analysed, not yet recommended | Further evaluation of incomplete recording and variation in local employment rates required.
Readmission rates | Passed initial screening | Initial analysis by the IC led to the conclusion that this indicator could not currently be compiled reliably.
Average length of stay | Passed initial screening | Higher priority given to intensity of care (bed days) and readmission rates.
Waiting times/access to service | Passed initial screening | Data recorded on MHMDS not currently sufficient for a meaningful quality indicator.
Admission rates | Passed initial screening | Higher priority given to intensity of care (bed days) and readmission rates.
Delayed discharge | Passed initial screening | No cluster information – not available on MHMDS.
Service user experience | Passed initial screening | No cluster information and sample size low; to be covered in PREM activity.
Crisis Resolution Home Treatment episode rates | Passed initial screening | Difficulties collecting consistent data.
Duration of untreated psychosis (DUP) | Passed initial screening | Lack of national data in 2011/12.
Section usage | Passed initial screening | Other indicators given higher priority due to uncertainty of interpretation as a quality indicator.
Early Intervention care | Passed initial screening | Indicator is service specific and hence not considered a generic quality measure.
MHCT – HoNOS | Not applicable | Covered in CROM activity.
Increased choice of care and treatment | Not applicable | More applicable to PREM activity.
IAPT KPIs (x3) | Not recommended | Out of scope.
Serious Untoward Incidents | Not recommended | Data not collected nationally in a consistent and reliable way.
Physical health – smoking cessation | Not recommended | Lack of national data.
Physical health – BMI/waist | Not recommended | Lack of national data.
London Physical Health CQUIN measures | Not recommended | Lack of national data.
Use of anti-psychotic drugs | Not recommended | Lack of national data.
Crisis Team/Home Treatment Teams | Not recommended | Due to reconfiguration by providers, services may be provided by community teams.
Safe and High Quality Care | Not recommended | Data not collected nationally in a consistent and reliable way.
Information for people in receipt of care and treatment | Not recommended | Only three mental health trusts have certification – possible future indicator.
Physical health check | Not recommended | Lack of national data.
Percentage of service users stepping up and down clusters | Not yet recommended | Lack of sufficient data at time of initial shortlisting.
Percentage exceeding expected cluster duration | Not yet recommended | Lack of sufficient data at time of initial shortlisting.

The detailed findings for the nine indicators analysed to date, with further rationale for inclusion/exclusion, are available in Appendix 2.

In summary, key findings based on the set of data analysed to date:

7 Indicators can be readily used in 2013-14,

Indicators selected are strongly linked to key domains such as quality of life, safety and positive experience and are without perverse consequences,

Indicators selected have been thoroughly assessed with confirmation of at least adequate data quality, and

Proposed indicators are suitable for a range of specific clusters.


2.3 Recommendations

Taking into account both the agreed importance of the indicator and the results of the analyses, the indicators recommended for use are:

1. The proportion of users with a crisis plan in place (limited to those on Care Programme Approach (CPA)).

2. The accommodation status of all users (as measured by an indicator of settled status and an indicator of accommodation problems).

3. The intensity of care (bed days as a proportion of care days).

4. The completeness of ethnicity recording.

5. The proportion of users in each cluster who are on CPA.

6. The proportion of users on CPA who have had a review within the last 12 months.

7. The proportion of users who have a valid ICD10 diagnosis recorded.

Guidance showing how the 7 recommended indicators were calculated from MHMDS is provided in Appendix 3, to support local analysis in the first instance until national reporting is possible in the future.
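For illustration only, the sketch below shows how two of the recommended indicators could be computed locally from a simplified, per-service-user extract. The field names (on_cpa, crisis_plan, bed_days, care_days) are hypothetical stand-ins; the authoritative definitions are those in Appendix 3 and the MHMDS specification.

```python
from collections import defaultdict

# Hypothetical, simplified per-service-user records for one provider.
# Field names are illustrative only; Appendix 3 and the MHMDS specification
# give the actual definitions used nationally.
records = [
    {"cluster": 4,  "on_cpa": True, "crisis_plan": True,  "bed_days": 0,  "care_days": 90},
    {"cluster": 4,  "on_cpa": True, "crisis_plan": False, "bed_days": 10, "care_days": 120},
    {"cluster": 12, "on_cpa": True, "crisis_plan": True,  "bed_days": 30, "care_days": 60},
]

def indicators_by_cluster(records):
    """Compute, for each cluster, the proportion of users on CPA with a crisis plan
    in place, and the intensity of care (bed days as a proportion of care days)."""
    grouped = defaultdict(list)
    for record in records:
        grouped[record["cluster"]].append(record)

    results = {}
    for cluster, rows in grouped.items():
        cpa_rows = [r for r in rows if r["on_cpa"]]
        crisis_plan_rate = (
            sum(r["crisis_plan"] for r in cpa_rows) / len(cpa_rows) if cpa_rows else None
        )
        care_days = sum(r["care_days"] for r in rows)
        intensity = sum(r["bed_days"] for r in rows) / care_days if care_days else None
        results[cluster] = {
            "crisis_plan_on_cpa": crisis_plan_rate,
            "intensity_of_care": intensity,
        }
    return results

print(indicators_by_cluster(records))
```

The same grouping approach extends to the other recommended indicators, which are all simple proportions over the users reported in each cluster.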

There were particular data difficulties with 2 additional indicators that were analysed:

1. Percentage of users who have a 7 day follow up after discharge from hospital, and

2. Employment status.

Based on current findings, it is not possible to recommend 7 day follow up after discharge from hospital, as the data at provider-cluster level is inadequate.

The proportion of users in employment (irrespective of CPA) has also been analysed. Whilst this has the potential to be a valuable indicator, it is not yet recommended as further evaluation is needed of the effect of incomplete recording and of variation in local employment rates.

Also, access has recently become available to analysis of Duration of Untreated Psychosis (DUP) data from London trusts. Because of the importance of DUP as a measure of access and a predictor of outcomes it is recommended that further coordinated work is carried out on national DUP data during 2013-14.

At this stage, it is suggested that the proportion of people on CPA and people with a CPA annual review are appropriate quality indicators mainly for psychosis clusters, with consideration for some non-psychotic clusters, e.g. Clusters 5 and 8. Intensity of care would be appropriate for non-psychosis and some selected psychosis and organic clusters. The other four indicators are, in principle, relevant to all clusters.


However, even if an indicator is recommended for specific clusters it is good practice to review data for all clusters for a variety of reasons including the potential misallocation to clusters. It is therefore proposed that all 7 indicators are recommended for all 21 clusters.

It is also recognised that there are other indicators that would be very important to use such as physical health and smoking cessation. Local work is occurring to consider a wider range of indicators and this will be valuable going forward.

2.4 Next steps

It is not expected that the indicators will have a direct financial impact during 2013/14.

The data relating to these indicators will continue to be analysed on a national basis during 2013. This will take account of issues such as the application of CPA guidelines, issues relating to employment and the need to be able to account for rapidly changing populations.

The indicators could also be considered alongside other data, such as Quality and Outcomes Framework (QOF) data on a Clinical Commissioning Group (CCG) basis, by commissioners and providers locally.

The national Quality and Outcomes PRG Sub Group will prioritise the analysis of any further indicators for potential addition to the set in 2013/14, such as clustering quality metrics (proportion of service users clustered, adherence to maximum cluster review periods and adherence to care transition protocols), taking into account the amount of resource required to analyse, interpret and recommend each indicator.

Providers and commissioners should ensure that data is collected for all 7 recommended indicators for all clusters through 2013/14. Providers and commissioners should consider how the information will be used locally and what issues and questions they raise.

Feedback will be sought to help the decision on how these can be practically implemented.


3.0 Clinician Rated Outcome Measurement

3.1 Scope

From April 2011 the use of the MHCT has been implemented nationally. HoNOS sits within the MHCT and is reported through MHMDS. The 2011 Quality and Outcomes Mental Health PbR report identified HoNOS/MHCT as the only CROM that is routinely used, collected and reported to the national data set.

Therefore, the scope of developing a CROM to support the currency model was limited to test and establish the use of MHCT/HoNOS ratings only, focussing on:

1. Exploring the utility of the MHCT/HoNOS as a generic Clinician Rated Outcome Measure for evaluating outcomes within all the PbR super classes and care clusters for working age and older adult mental health services.

2. Exploring the factor structure of the MHCT/HoNOS to identify the most generalisable factor structure across the PbR clusters.

3. Evaluating the MHCT/HoNOS mean total scores and total factor scores from any emergent factor structure in terms of their sensitivity to change over time.

4. Exploring the utility of Parabiaghi et al.’s (2005) clinical significance approach as a generic outcome measure for demonstrating the quality of outcomes observed.

It was established that the use of single items from either the MHCT or HoNOS was unlikely to prove helpful in evaluating outcomes for PbR as this would reduce the potential to capture a breadth and depth of health and social functioning outcomes, and could introduce issues of ceiling and floor effects.

Reporting outcomes for each individual scale within each cluster was also ruled out as this may lead to a lengthy, overly detailed outcome reporting mechanism, with potential for additional central resource requirements to manage the process. The study, therefore, focused on those approaches that provided useful, meaningful summaries of the outcome data such as factor models, total scores and clinical significance.

Factors are dimensions within a measurement tool that describe a collection of items (or scales, in the case of HoNOS) that correlate with each other and form an overarching scale of measurement in their own right. They summarise and describe data in a meaningful way without losing any of the original data. They improve the sensitivity of an outcome measure for detecting change and are able to describe the dimensions in which change has been effected. Within a factorial approach, the total score therefore represents the sum of the factor or dimension scores within the measurement tool. There is currently no published research relating to the factor structure of the MHCT; however, several factor models have been identified for the HoNOS.


Clinical significance looks at the degree to which such changes have impacted on service users' overall quality of life or health and social well-being/functioning within the context of a mental health population. It involves applying two formulae to place service users into a category of 'severity' based on individual scores at referral to mental health services. Reliable change indices and clinically significant cut-off scores are then calculated and subsequently used to evaluate the clinical significance of any change observed in total scores, which is represented by three classification categories (a simplified sketch of the calculation follows the list below):

clinically significant improvement,

stability, and

clinically significant deterioration.
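As a rough, simplified sketch of the reliable change element of this approach only (the severity categories and clinically significant cut-off scores are omitted), the example below classifies the change in a HoNOS total score using a Jacobson and Truax style reliable change index. The reliability and standard deviation values are invented placeholders; the actual indices will be recalculated from MHMDS data, as noted in section 3.3.

```python
import math

# Placeholder psychometric constants, for illustration only. The reliable change
# indices for PbR use will be recalculated nationally from MHMDS data.
SD_BASELINE = 6.0   # assumed standard deviation of HoNOS total scores at referral
RELIABILITY = 0.80  # assumed reliability of the HoNOS total score

# Standard error of the difference between two measurements (Jacobson & Truax).
se_measurement = SD_BASELINE * math.sqrt(1 - RELIABILITY)
se_difference = math.sqrt(2 * se_measurement ** 2)
reliable_change = 1.96 * se_difference  # change required to be reliable at the 5% level

def classify_change(referral_total: float, discharge_total: float) -> str:
    """Classify the change in HoNOS total score (lower scores indicate fewer problems).
    The full Parabiaghi et al. (2005) approach additionally applies clinically
    significant cut-off scores, which are omitted from this sketch."""
    change = referral_total - discharge_total
    if change >= reliable_change:
        return "clinically significant improvement"
    if change <= -reliable_change:
        return "clinically significant deterioration"
    return "stability"

# Example: a fall in the HoNOS total score from 18 at referral to 9 at discharge.
print(classify_change(18, 9))
```

The percentage of service users in each cluster falling into each category can then be reported for each provider.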

3.2 Progress and findings

In 2012 a study was undertaken to explore the utility of the MHCT as a generic CROM for evaluating outcomes within the PbR quality and outcomes framework for working age and older adult mental health services.

A principal component analysis revealed a five factor structure for the MHCT but confirmatory factor analysis suggested the model was not generalisable to all PbR clusters.

A new four factor model of the HoNOS was developed during the course of the study using principal component analysis and represented a significantly better factor structure than the MHCT factor model. The 4 factors in the new model are:

1. Personal Well-Being,
2. Emotional Well-Being,
3. Social Well-Being, and
4. Severe Disturbance.

It was compared to existing factor models of the HoNOS in terms of generalisability across the PbR clusters using confirmatory factor analysis.

The new four factor model was the only model to generate fit statistics for all levels of the data and produced the best fit statistics overall with the exception of the PbR clusters 5 and 8 data sets, in which another model produced better fit statistics, and PbR care clusters 1 and 2, which produced similar fit statistics. This suggested the new four factor model of HoNOS was the most generalisable model across all of the PbR super classes and clusters within working age adult and older adult mental health services.

The new four factor model of HoNOS was found to be sensitive to change over time and was able to detect statistically significant changes in the average/mean HoNOS total scores and factor scores between referral and first review for all PbR clusters.

The application of Parabiaghi et al.’s (2005) formulae for calculating the clinical significance of changes observed in total HoNOS scores between referral and discharge resulted in a promising model for evaluating the quality of outcomes reported in mental health services. The model was able to demonstrate and describe the percentage of service users who showed clinically significant improvement, stability, and deterioration for all PbR clusters. The initial evaluation suggested the approach could potentially add significant value when combined with the new four factor model.

Therefore, the findings of the study suggest that based on best statistical fit, as opposed to ‘good statistical fit’, a new four factor model of the HoNOS currently represents the most generalisable factor model available for evaluating outcomes across the PbR clusters compared to other factor models.

More detailed findings can be found in the report on using the MHCT as an Outcome Measure for the MH PbR Q&O Framework.

3.3 Recommendations

It is recommended that the new four factor model of the HoNOS is adopted as the generic CROM for mental health PbR and that outcomes are evaluated in terms of the statistical significance of changes in average HoNOS total scores, using the effect size as a key quality indicator. The effect size is a statistic that gives an indication of the impact an intervention or service is having on the clinical population. (NOTE: it is the HoNOS scores of the MHCT assessment that will be used for outcome evaluation.)
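As an illustration only, and assuming effect size here means a standardised mean change (one common choice, such as Cohen's d for paired scores, rather than a formulation mandated by this report), the change in HoNOS total scores for a cluster could be summarised as follows.

```python
from statistics import mean, stdev

def paired_effect_size(referral_totals, review_totals):
    """Standardised mean change in HoNOS total scores between two time points:
    the mean of the paired differences divided by their standard deviation.
    This is one common definition of effect size; national guidance may adopt
    a different formulation."""
    diffs = [ref - rev for ref, rev in zip(referral_totals, review_totals)]
    return mean(diffs) / stdev(diffs)

# Invented HoNOS total scores for one cluster at referral and first review.
referral = [18, 22, 15, 19, 24, 17]
review = [12, 20, 11, 15, 21, 14]
print(round(paired_effect_size(referral, review), 2))
```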

The factors should be used as overarching summaries for the HoNOS scales they represent and factor scores should be interpreted as the sum of those items making up each of the factors. The factors allow both commissioners and service providers to see high level summaries of where improvements and deterioration are taking place and should form the basis of understanding for where positive outcomes are being achieved and where further improvements may be required. The factors can also be evaluated and interpreted in terms of the effect sizes achieved.

It is recommended that clinical significance is also considered as part of the outcome evaluation procedure associated with the four factor model. However, it will be important to recalculate the reliable change indices using the MHMDS to ensure they are accurate and fit for purpose nationally.

To support the use of this model, providers should ensure they are collecting MHCT (and therefore HoNOS) scores as recommended in the guidance and that the individual items are scored correctly.

From April 2013 onwards every MHCT item score will form part of the baseline for the future outcome measure, hence accurate rating is essential.

Due to variability in the quality of scoring, data recorded before this point will be disregarded for outcome measurement.

3.4 Next steps

From April 2013, providers and commissioners will need to ensure:


The MHCT data submitted for each cluster is accurate, complete and of a high quality;

Service user needs have been fully assessed and accurately reflected in the scoring of all MHCT items (including all HoNOS items) at referral, any review and discharge.

Providers and commissioners should consider the introduction of mechanisms to incentivise submissions of reliable, high quality MHCT data for each cluster. This should be adequately supported and evidenced by clinical records and PROMS, with appropriate audit undertaken.

The Q&O PRG sub group will undertake further work during 2013/14 to establish:

How the model works when applied to national data,

How the CROM can be directly reported from MHMDS,

How service providers’ performance could be evaluated,

How to incentivise good practice,

How this could be linked to a payment mechanism for PbR.

MHCT data will be required from the MHMDS to thoroughly test the proposed approach and its potential application nationally.

A full report will follow with recommendations once the analysis from the initial testing has been completed. A user guide that provides further instructions on how this can be calculated locally will be made available on the Care Pathways and Packages (CPPP) website and DH Quickr site in March 2013.


4.0 Patient Rated Outcome Measurement

4.1 Scope

An extensive range of outcome measures is available for use across all areas of mental health. These include generic measures, diagnosis specific measures, and measures commonly used in therapy, particularly psychological interventions (NHS compendium, NIMHE 2008). No single measure has been identified as having validity across all areas of mental health for all service users.

A range of approaches were considered for using PROMs as part of the PbR process which included:

One PROM per cluster,

One PROM per superclass,

One PROM across all clusters.

Organisations are already using a range of PROMS on a diagnosis/pathway basis locally. The national Q&O PRG sub group are keen for this to continue whilst further testing is undertaken of a measure suitable for PbR purposes. It was therefore agreed in the first instance to limit the scope of activity to establishing one PROM that could be implemented across all clusters.

The Q&O PRG sub group discounted development of a new measure and set the scope of the activity to include evaluation of a group of existing measures that could potentially be used across all clusters. The following key criteria were put in place for the evaluation:

No training required/no costs for training

No copyright issues/free for use

Cost effectiveness (locally and nationally including software to analyse results)

Must take less than 20 minutes to complete by service users

Easy to use and interpret

Tools would remain in scope where validity and reliability were not published.

4.2 Progress and findings

An initial long-list of measures was identified for further analysis and evaluation against the key criteria. It was agreed that any shortlisted measure would be given further scrutiny at workshops by providers, commissioners and service users. The aim was then to select the most suitable measures for further testing before agreeing on a preferred measure to test as part of the PbR package.

The following series of measures was initially evaluated:


Amritsar Depression Inventory
Beck Hopelessness Scale
CANSAS
CHOICE
CORE and associated versions
CUES
DREEM
EQ-5D
GHQ-12
Hospital Anxiety and Depression Scale
Illness Perception Questionnaire
Ohio Consumer Survey Assessment
PHQ-9
Profile of Mood States
PSYCHLOPS
Recovery Star
Role Functioning Scale
Schwartz Outcome Scale
Sheehan Disability Scale
Short Form 36
Social Adaptation Self-evaluation Scale
Social Adjustment Scale
Social Functioning Scale
The How Are You Scale
Warwick Edinburgh Mental Well Being Scale
Work and Social Adjustment Scale

From these, a shortlist of measures emerged that met all of the key criteria, as follows:

CHOICE

EQ-5D

Short form 36

Warwick Edinburgh Mental Well Being Scale (WEMWBS)

Work & Social Adjustment Scale

World Health Organisation Disability Assessment Scale (WHO-DAS)

Service Users highlighted a number of key considerations giving clear direction that measures should focus on recovery and wellbeing rather than diagnosis and symptoms. Other recurring themes were:


Relapse prevention,
resilience,
quality of life,
self care,
sleep,
frequency of reoccurrence of ill health and ability to cope with the same,
knowing where to re-access help quickly, from appropriately qualified and skilled professionals who can be trusted,
recovery – back to who I am,
feeling respected and treated with dignity,
getting better,
having an appropriate tool kit to cope before being discharged and during periods of functioning,
sustained improvement,
social functioning,
reaching personal goals,
improving health,
coping mechanisms.

The service users recognised that no area could be taken in isolation and agreed the following diagram presented what they would want a PROM to consider:

Many comments received from provider organisations mirrored the findings of the University of York report Investigating patient outcome in mental health (Centre for Health Economics, paper 48, May 2009): barriers to the effective implementation of any measure included the need to ensure clinicians and service users were able to see the clinical benefits of using the PROM. Poor information technology systems, change in culture, time constraints, training and lack of interest from patients were all identified as barriers. It was felt that efforts should be made to reduce these barriers when choosing the PROM.

Service users were asked to comment on the shortlist of measures. A range of comments were received regarding style and layout, ease of completion and length, and appropriateness of questions. A summary of responses can be found in Appendix 4. Service users were in overwhelming agreement that WEMWBS was the most suitable outcome measure.

24

MH PbR Q&O Framework Report – Feb 2013


In parallel to this activity, WEMWBS was also recognised as the PROM of choice by service users in a joint study by the Mental Health Research Network, which is available at: http://www.mhrn.info/data/files/MHRN_PUBLICATIONS/REPORTS/outcome_measures_report.pdf

As a result, WEMWBS was chosen as the preferred PROM for initial testing. A usability and collectability pilot was conducted across 7 sites during November and December 2012 to determine the tool's potential usage on a cluster basis. 625 responses have been included in the initial analysis.

87% of survey responses were from Working Age Adults, 11% Older People, and 2% not known. 71% of responses were from service users in the psychotic superclass, 24% nonpsychotic, and 5% organic.

From the initial results, the table below gives a summary of usability of response by cluster:

[Table: usability of WEMWBS responses by cluster, recording for each cluster the number of responses completed by the patient alone, completed with assistance from a carer or a health professional, declined, not obtained because the patient was unable to give informed consent, or left blank; 625 responses in total.]


Taking the columns ‘assisted by carer’, ‘assisted by health professional’ and ‘patient alone’ as completed responses, ‘patient declined’ and ‘patient unable to give informed consent’ as not completed, and discounting the blanks, gives the following results (a sketch of this calculation follows the table):

[Table: completed and not completed responses by cluster, with the percentage completed for each cluster. Across all clusters and the ‘not clustered’ group the totals were 494 completed and 62 not completed responses (556 in total), giving an overall completion rate of 89%.]
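A minimal sketch of that calculation, using invented per-cluster counts of responses in each completion category (not the pilot data):

```python
# Invented per-cluster counts of WEMWBS responses by completion category,
# for illustration only; these are not the pilot figures.
pilot_counts = {
    4: {"patient alone": 20, "assisted by carer": 2, "assisted by health professional": 2,
        "patient declined": 1, "unable to give informed consent": 0, "blank": 1},
    8: {"patient alone": 15, "assisted by carer": 5, "assisted by health professional": 6,
        "patient declined": 0, "unable to give informed consent": 1, "blank": 2},
}

COMPLETED = ("patient alone", "assisted by carer", "assisted by health professional")
NOT_COMPLETED = ("patient declined", "unable to give informed consent")

for cluster, counts in sorted(pilot_counts.items()):
    completed = sum(counts[c] for c in COMPLETED)
    not_completed = sum(counts[c] for c in NOT_COMPLETED)
    total = completed + not_completed  # blanks are discounted
    if total:
        print(f"Cluster {cluster}: {completed}/{total} completed ({100 * completed / total:.0f}%)")
    else:
        print(f"Cluster {cluster}: no usable responses (reported as N/A)")
```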

Overall, findings would suggest a strong completion/usability rate across clusters.

However:

Data is limited in some clusters, in particular 1, 2, 15 and 21.

It is understood that in some instances ‘not completed’ responses haven’t been recorded. It is not known how many cases are involved but it would lower the overall percentage completed.

Charts summarising other key findings from the pilot covered ease of completion, time taken, and the importance of the questions. [Charts not reproduced.]


Recently, the Short Warwick Edinburgh Mental Well Being Scale (SWEMWBS) has emerged as a preferred version for further testing. The journal Health and Quality of Life Outcomes (http://www.hqlo.com/content/7/1/15) published a report which found better psychometric properties in the 7 item measure than in the longer 14 item version. The study also showed these items have no gender bias, and the age range used in the sample suggests it can be reliably applied to people up to the age of 74, making it appropriate for adult mental health services and mental health services for older people.

4.3 Recommendations

The use of a PROM is recommended as part of the PbR 2013/14 road test package and further emphasised in the Mental Health Payment by Results in 2013-14 letter from the DH in January 2013.

Further research is required to explore the use of the Short Warwick Edinburgh Mental Well Being Scale (SWEMWBS). It is suggested that where no PROM is currently being used, organisations consider the use of one of the WEMWBS measures, with a preference for the shortened version.

Provider organisations should continue to use a range of PROMs on a pathway and/or diagnosis basis, as chosen to meet their needs.

Commissioners and providers should ensure a PROM is being used for all of the clusters and that quarterly review of the data relating to this is undertaken.

To aid comparison and analysis, the PROM should be collected in line with the MHCT, i.e. on completion of initial assessment, at routine review, significant change in presenting needs, and at discharge.

4.4 Next steps

Further work will continue in 2013/14 to test the short WEMWBS, including its potential ability to demonstrate service user outcomes on a cluster basis. The effects of mental health professionals and/or carers assisting service users to complete the PROM will also be considered.

Concerns about use of the tool for service users with Bipolar Affective Disorder have been raised. Work currently being conducted by the IAPT Severe Mental Illness (SMI) group will continue to inform these developments.

Provider organisations should take steps to use SWEMWBS in the absence of another PROM from April 2013. More information in relation to testing SWEMWBS will be available on the CPPP website and the DH Quickr site.

Work will begin in 2013 to establish how MHMDS can accommodate PROM data.

Consideration will be given to carer involvement in quality and outcome activity.


5.0 Patient Rated Experience Measurement

5.1 Scope

The national Care Quality Commission (CQC) Community Survey is used to explore patient experience of services. The survey is anonymous and is not linked to individual patients or reported on a cluster basis. In addition to the national survey, approaches to measuring patient experience have been developed locally.

Consideration was given to limiting the scope of activity to the national survey and integrating that within the PbR framework. However, it was agreed by the national Q&O PRG sub group to widen the scope to include consideration of local approaches, as this could lead to a more relevant measure being developed.

Therefore the scope of the work was extended, as follows, to establish:

Whether and how the CQC survey could be used as part of the MH PbR Framework,

Whether there are any patient experience questions used locally that could be used as part of a measure, and if so,

Whether they could be used in conjunction with or separate to the CQC survey.

It was established that changes to the CQC survey would not be possible in 2012.

This led to a focus on how best to use existing questions in the survey rather than attempting to refine the questions. It also meant linking patient responses to the clustering currency model would not be possible within the scope of the analysis of the survey.

5.2 Progress and findings

Through 2012 good progress has been made in establishing service user preferences about patient experience, particularly from the CQC survey, establishing what patient experience questions are asked locally and how these could be integrated within the MH PbR framework.

The CQC survey questions were shared with service users at a consultation event in October 2012. Service Users were asked to rate the relevance of the questions within the survey. The following 12 were rated as the most important:

1. Did you feel carefully listened to the last time you saw your NHS health care worker?

2. Did this person take your views into account?

3. Did you have trust and confidence in this person?

4. Do you understand what is in your NHS care plan?

5. Do you have a telephone number to contact your mental health service out of hours?

6. In the last 12 months have you received any sort of talking therapy from NHS mental health services?


7. In the last 12 months have you had a care review meeting to discuss your care?

8. Do you think your views were taken into account when deciding what was in your NHS care plan?

9. Do you think your views were taken into account when deciding which medication to take?

10. Overall rating of the service you have received?

11. In the last 12 months has your NHS worker checked how you are getting on with your medication?

12. Time period for last seeing someone from mental health services.

Consideration was then given to whether and how these questions could be tested and integrated within the PbR framework. An option was for provider trusts to ask service users for information on these 12 questions as well as using the full CQC survey and local experience measures. This option was quickly discounted due to the extra demands it would place on providers and service users.

The consultation therefore sought to explore which of these 12 questions were consistently asked across provider organisations.

The following three questions, though asked in different ways, emerged:

1 Did you feel carefully listened to the last time you saw your NHS health care worker?

5 Do you have a telephone number to contact your mental health service out of hours?

8 Do you think your views were taken into account when deciding what was in your NHS care plan?

Approaches to collecting the data were varied, with paper, text, and electronic devices used, differing spreads of service users across in-patient and community settings, and different timings of collection. None of the sites that responded asked all of the questions at initial assessment, at each CPA, and on discharge.

Also, in 2012, the DH introduced the “friends and family test”. Although initially targeted at elements of acute services the intention is to eventually roll the test out to all those using NHS services. Therefore, consideration has also been given to how this can be included within mental health services as part of the mental health PbR work in 2013/14.

5.3 Recommendations

Commissioners and providers should agree local activity to assess patient experience. Consideration should be given to using the following question.

How likely are you to recommend our services to friends and family if they needed similar care or treatment?

The scale below should be used to answer the question:

1 Extremely likely


2 Likely

3 Neither likely nor unlikely

4 Unlikely

5 Extremely unlikely

6 Don’t know

It is recommended that providers report on total numbers of service users that are given the opportunity to respond, numbers of responses and the breakdown of the response categories, at organisational level initially.
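As a rough sketch of that reporting only (the category codes follow the scale above; the figures are invented), a provider-level summary could be produced as follows.

```python
from collections import Counter

SCALE = {
    1: "Extremely likely", 2: "Likely", 3: "Neither likely nor unlikely",
    4: "Unlikely", 5: "Extremely unlikely", 6: "Don't know",
}

# Invented figures for one provider over a quarter.
offered = 240                                              # users given the opportunity to respond
responses = [1, 1, 2, 1, 3, 2, 1, 6, 2, 4, 1, 1, 2, 5, 1]  # category codes returned

breakdown = Counter(responses)
print(f"Offered: {offered}, responses: {len(responses)} "
      f"({100 * len(responses) / offered:.1f}% response rate)")
for code in sorted(SCALE):
    print(f"  {code} {SCALE[code]}: {breakdown.get(code, 0)}")
```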

The intention will be to report at provider level nationally and to also be capable of displaying performance at CCG level.

As with the PROM, this should be collected in line with the MHCT, i.e. on completion of initial assessment, at routine review, significant change in presenting needs, and at discharge.

Commissioners and providers should ensure that a quarterly review of the data relating to this is undertaken.

As yet, we are unable to recommend the additional questions and more pilot work will be undertaken during 2013 to develop a national approach, which will also consider whether patient experience can link to recovery.

5.4 Next steps

In 2013/14, further work will be undertaken to develop a comprehensive approach to include patient experience measures within the currency model.

Consideration will be given to how the agreed PREM question(s) can be reported via MHMDS in the future.

Agreement will be reached on the role that the CQC survey will have in the PbR Framework.

Further national work will be undertaken to analyse results of the “friends and family” question, on a cluster basis. More information in relation to this activity will be made available on the CPPP website and the DH Quickr site.


6.0 IMHSEC Website

The IMHSeC group (a partnership between the Pharmaceutical Serious Mental Health Initiative (PSMI), the DH Mental Health Directorate and the NHS Confederation Mental Health Network) has worked with the PbR Quality & Outcomes sub group to develop a web-based tool that provides guidance on the content of the care package for each of the clusters.

This work was identified as being an important step in enabling the linkage between inputs and outcomes to be established and to aid understanding of the overall service user journey. The key features of this work were to collate work already carried out, to provide clear guidance on which interventions could be delivered / commissioned within each of the clusters, and to link to NICE guidance and standards.

A website is now in place that provides a tool to help commissioners and service providers to set out local arrangements for care packages linked to the care clusters, which includes a high level view of good practice pathways. This can be used locally as a starting point, and a framework for more detailed local planning and operationalisation.

Currently, work is being undertaken to update the site with the latest Quality and Outcomes developments and it is anticipated that regular refreshes of information will be undertaken to keep the information up to date and relevant.


7.0 Summary

The National Quality and Outcomes PRG sub group has overseen the ongoing development of the Q&O Framework for MH PbR in 2012/13, including:

7 quality indicators analysed and recommended for use.

Recommendation for use of the CROM based on HoNOS/MHCT.

Progress made on the development of a PROM. Testing of WEMWBS has received positive feedback from Service Users and further testing of the short version of WEMWBS will take place in 2013/14.

Further progress on the development of PREMs and the recommendation to collect the “friends and family” question.

Completion of the IMHSEC website, which provides guidance on the content of the care package for each of the clusters.

National challenges remain in moving to a system based on payment for outcomes and recommendations are made to strengthen the work to date to ensure a fit for purpose framework in which payments can be used to incentivise outcomes in mental health.

It is not expected that the indicators will have a direct financial impact during 2013/14.

In 2013/14, the focus will be on refining the framework further, testing the measures in practice and developing further recommendations on how the indicators and outcome measures can be used to incentivise high quality care. Key objectives include:

Testing the CROM on National Data (MHMDS) and making progress on integrating PROM and possibly PREM data on MHMDS.

Further analysis of indicators and measures, including assessment of quality indicator data to establish what good performance is and how this can be measured.

Further testing of PROMS and PREMS to establish suitability as part of the framework.

Ongoing stakeholder engagement to ensure that the right people, including service users, are engaged as the framework moves into implementation.

Making links to the payment mechanism.


Providers and commissioners will need to ensure that the following is put in place in 2013/14:

Quality Indicators - confirm how they will review the indicator data collected on a cluster basis throughout 2013/14 and consider the value that these provide on a local basis.

CROM – gain assurance that MHCT data submitted for each cluster is accurate, complete and of high quality. This will require providers to ensure all MHCT items are recorded accurately and completely from April 2013 onward at the required points in the service user journey.

PROM - ensure a PROM is being used for all of the clusters and that quarterly review of the data relating to this is undertaken. It is recommended that where no PROM is currently being used within an organisation, consideration should be given to using SWEMWBS.

PREM – Consideration should be given to routinely asking and collecting the “friends and family” question.

CROM, PROM, and PREM data should all be collected in line with the MHCT, i.e. on completion of initial assessment, at routine review, significant change in presenting needs, and at discharge.
