Urgent Care in General Practice | Out of Hours Benchmark | Urgent Care Centres | Primary Care in A&E

Meeting the quality and productivity challenge in out of hours care: what can we learn from the out of hours benchmark?
Improving Patient Safety in Out of Hours Care, 22nd June 2010
Rick Stern, Director, Primary Care Foundation, and NHS Alliance Lead for Urgent Care
rick.stern@primarycarefoundation.co.uk  07709 746771

What I will cover
● The current context – learning from recent reviews
● The out of hours benchmark – what we can learn about quality and productivity
● How can this lead to improvements in patient care and safety?
● Shifting the focus: from organisational to individual
● Improving the benchmark
● Where are we heading … the future of OOH and urgent care

New Leadership Group for Urgent Primary Care
Members: Dr Albert Benjamin, Anita Dixon, Alan Franey, Jayne Hetherington, Eddie Jahn, Dr Darren Mansfield, Lesley McCourt, Alison McWilliam, Dr Ray Montague, Dr Russell Muirhead, Diane Ridgeway, Dr Bruce Websdale, Gilly Wilford, Nigel Wylie
Roles and organisations represented: Clinical Director, Waldoc CBS (Waldoc Ltd); Chief Executive, Central Nottinghamshire Clinical Services; Chief Executive, Barndoc Healthcare Ltd; General Manager, OWLS CIC Ltd; Managing Director, Harmoni; GP Clinical Lead in Urgent Care; Chief Executive, NHS Bolton; General Manager; Medical Director, Nottingham Emergency Medical Services Limited (NEMS CBS); Brisdoc Healthcare Services; Chairman, Shropshire Doctors Cooperative Ltd; Chief Executive, East Lancashire Medical Services Ltd; Medical Director, Primecare; Director of Finance & Contracts; Chief Executive, South East Health; Partnership of East London Co-operatives; Urgent Care 24

Emerging Priorities
1. Patient Safety
2. Integrated Urgent Care
3. Demonstrating quality
4. ‘Rebranding’ Out of hours

The Primary Care Foundation – developing best practice in primary and urgent care
● Urgent Care in General Practice
● Out of Hours Benchmark
● Urgent Care Centres
● Primary Care in A&E
A resource for commissioners of urgent care

A long history of reports and reviews …
● Department of Health (Carson Review, 2000) – Raising Standards for Patients: New Partnerships in Out-of-Hours Care
● National Audit Office (May 2006) – The Provision of Out-of-Hours Care in England
● Four inner London PCTs (May 2007) – Report into the death of Penny Campbell
● Healthcare Commission (September 2008) – Not Just a Matter of Time: A Review of Urgent and Emergency Care Services in England
● Primary Care Foundation (January 2010) – Improving out of hours care: what lessons can be learned from a national benchmark of services?
● Department of Health (February 2010) – Out-of-Hours Services: project to consider and assess current arrangements
And still to report …
● Care Quality Commission (ongoing) – Enquiry into Take Care Now

What can we learn from recent reports on out of hours services?
Key areas in the Department’s Review
● Commissioning and performance management, including tackling inappropriate variation
● Selection, induction, training and use of out-of-hours clinicians (including the use of locums)
● Management and operation of Medical Performers Lists
Actions following on from the Review:
● Reviewing the National Quality Requirements
● Developing a new national model contract for OOH services
● Stronger performance management (including use of English and applying the performers list)
● Greater involvement of local GPs
But we now have a new government …

Developing the benchmark
● Awarded the tender by the Department of Health in November 2007
● Numerous pilots, including across the whole of the North East
● National advisory group to steer progress and set the price
● Established three years of support, with a benchmark every six months and a patient experience survey once a year
● Currently over 100 out of 152 PCTs in England are members

Developing the benchmark: rounds 1, 2, 3 & 4
● First benchmark completed March 2009, with reports on 63 services and half-day workshops for commissioners and providers
● Second benchmark, with reports on over 90 services, completed November 2009, with the first patient experience survey managed by our partners, CFEP UK Surveys
● Third benchmark, reviewing performance at the period of peak demand over Christmas 2009 and New Year 2010 – to be completed by end July 2010
● Fourth benchmark, again a full benchmark including patient experience – to be completed October 2010

How does it work?
● Data extract – most from one information system, but now working with a number of others
● Web-based questionnaire for commissioners
● Web-based questionnaire for providers
● Validate data
● Produce reports
● Workshops
● Anonymity – about to change
● Steering group and user group

12 headline indicators
Cost
1. Cost per head
2. Cost per case
Productivity
3. Number of cases per clinician per hour
Outcomes
4. Referrals to hospital (if possible, subdivided between referrals to A&E and referrals to a hospital bed)
5. Overall breakdown of dispositions (advice / primary care centre / home visit)
6. % of calls classified urgent on receipt
Process
7. The quality of clinical governance systems and processes
Performance
8. Time to clinical assessment for all calls, as a percentage
9. Time to face-to-face consultation for urgent calls (including % urgent after assessment)
Patient experience
10. Patient experience of receiving telephone advice
11. Patient experience of treatment at a centre
12. Patient experience of home visits
(An illustrative sketch of how two of these indicators might be calculated from a case-level extract is shown below, after the next two slides.)

The evidence suggests …
Out of hours services are improving … despite what you might hear in the media. Most providers have made a rapid transition from ‘rota organising clubs’ into true healthcare providers. In doing so they have got much better at:
● Matching capacity to predictable demand, giving ample time for clinicians to do their work well
● Meeting performance standards
● Introducing governance processes to ensure a consistent and safe response to patients
● Engaging local clinicians in the service

A rapid response matters to patients
● Patients value a responsive service and associate this with good care. There is a wide difference between the responsive services and the comparatively slow.
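To make the headline indicators above more concrete, the following is a minimal sketch, in Python, of how two of them – the percentage of urgent cases clinically assessed within 20 minutes and the number of cases per clinician per hour – might be derived from a case-level data extract. The field names, the sample records and the clinician-hours figure are illustrative assumptions for this sketch only; they are not the benchmark’s actual data specification or method.

```python
# Illustrative sketch only: field names and sample data are assumptions,
# not the benchmark's actual data specification.
from datetime import datetime, timedelta

# Hypothetical case-level extract: one record per out-of-hours case.
cases = [
    {"received": datetime(2010, 1, 2, 19, 0),
     "clinically_assessed": datetime(2010, 1, 2, 19, 12),
     "urgent_on_receipt": True},
    {"received": datetime(2010, 1, 2, 19, 5),
     "clinically_assessed": datetime(2010, 1, 2, 19, 40),
     "urgent_on_receipt": True},
    {"received": datetime(2010, 1, 2, 19, 10),
     "clinically_assessed": datetime(2010, 1, 2, 19, 55),
     "urgent_on_receipt": False},
]

# Hypothetical rota data: total clinician hours worked over the same period.
clinician_hours = 1.5

def percent_urgent_assessed_within(cases, minutes=20):
    """Share of cases flagged urgent on receipt that received a definitive
    clinical assessment within the given number of minutes."""
    urgent = [c for c in cases if c["urgent_on_receipt"]]
    if not urgent:
        return None
    within = [c for c in urgent
              if c["clinically_assessed"] - c["received"] <= timedelta(minutes=minutes)]
    return 100.0 * len(within) / len(urgent)

def cases_per_clinician_hour(cases, clinician_hours):
    """Simple productivity measure: completed cases divided by clinician hours."""
    return len(cases) / clinician_hours

print(f"Urgent cases assessed within 20 min: {percent_urgent_assessed_within(cases):.0f}%")
print(f"Cases per clinician per hour: {cases_per_clinician_hour(cases, clinician_hours):.1f}")
```

In the benchmark itself these measures are produced from the providers’ own system extracts and validated before reports are issued, as outlined in the “How does it work?” slide above.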
There is a clear relationship between Ipsos MORI respondents’ view of the speed of response and the rating for the care received
[Scatter plot: each dot is one PCT. Horizontal axis – percentage of respondents saying how quickly care was received was ‘about right’ (40–85%); vertical axis – percentage rating the care received as good or very good (40–85%).]

Seven years on, most providers are still falling short on a key NQR
Many providers are falling short on the standard for definitive clinical assessment of urgent cases, which we see as an important issue of patient safety.

We reported the percentage of urgent cases that were assessed in 20 minutes …
[Bar chart: each bar is one service (a provider/PCT), showing the percentage of urgent cases assessed within 20 minutes, from close to 100% down through services increasingly falling below the standard.]

There is a very striking variation between services in the proportion of cases identified as urgent on receipt
[Bar chart: each bar is one service (a provider/PCT), showing the percentage of cases identified as urgent by non-clinical call handlers, ranging from close to 0% up to around 60–70%. How safe?]

How safe? Coding needs to be improved …
In far too many services it is impossible to be sure how many patients make their way towards hospital.
[Bar chart: each bar is one service, showing the recorded percentage of cases going towards hospital (0–25%). Annotations mark a possible ‘normal band’, with lower figures flagged as ‘suspect’ or ‘not credible?’. We know that many services, particularly to the left of the chart, are undercounting patients going towards hospital.]

What is quality in OOH?
Quality is likely to be a composite measure of a number of these factors. Our conclusion is that the services that perform well on all these factors are far from being the most expensive, but also that the very cheap providers do not appear to have the management headroom to perform consistently enough to feature in this group.

Using this measurement of productivity to drive improvements in care
● An example: an out of hours provider that was part of the earlier pilots
● Concerned that productivity was low
● Looked at productivity by each clinician – reported this back and reviewed performance with the clinical manager
● Also looked at other factors, and identified some doctors regularly late for sessions and others not picking up calls at the centre when there were no visits to do
● Results included:
  ● productivity more than doubled
  ● clinicians happier that workload was more evenly spread
  ● ‘by making clinicians more productive – supporting them as necessary, sorting out the problems that they face and addressing one or two poor performers – it has improved care for patients because clinicians can focus on the job that they are there to do’
● Learned that variations in performance tend to be less about external factors (e.g. geography, demography) and more about how staff are supported and managed

Improving Patient Safety – responding to a low level of urgent cases on receipt
● Concern about benchmark results led to a rigorous baseline audit of calls taken and the priority given
● Call handlers were clear about life-threatening calls and A&E referrals
● Other specific areas were identified that could be addressed by training designed to develop each call handler’s confidence and knowledge
● Results included:
  ● a post-training audit showed that the percentage of urgent calls has increased and is moving towards the national average
  ● more importantly, the priority given has been shown to be appropriate to each presenting case, as evidenced by the end priority given by the consulting clinician
  ● call handlers felt supported and reassured – they benefited from extra training and from comparing how they work with others
● Better identification of urgent needs improved patient safety

From variation across organisations to variation between clinicians
● There is substantial variation within a typical service between individual clinicians. The response will often be shaped more by who deals with the case than by the details of the case itself.
● Developing a consistent, safe and appropriate response does not just involve looking at the outliers; it involves consistent feedback to individuals, comparing them with their peers, so that they can identify specific things they might do differently for the benefit of patients and the service.

Future changes …
For services
● All services need to ensure that they are using the results to work out how to improve local care – it is about using national comparisons to drive local improvements
● Recent reviews have highlighted the importance of good recruitment, induction, training and continuing support of staff
● Some services need to make sure that they are responding to calls more rapidly than is currently the case
For the OOH Benchmark
● The benchmark will extend to cover all these areas
● Making the benchmark more open and transparent will ensure that it is more useful to services as a tool for driving improvements
● Creating a new governance group as well as a user group

Key issues for the future
● Patient Safety
  ● A new initiative for rapidly sharing learning?
  ● Tighter rules or a cultural shift?
● Focus on learning and improvement
  ● Responding to benchmarking and other comparisons across and within organisations
  ● Better internal scrutiny – good governance and independent NEDs
  ● Greater openness and transparency
● Working as part of an integrated system
  ● Networks and accountability
  ● Three-digit number
  ● Clarity for the public and patients about using urgent care services
● Commissioning for quality
  ● Commissioning pathways
  ● Identifying the cost of quality in urgent care services

Discussion & Questions
For more information, visit our website: www.primarycarefoundation.co.uk
Or contact me: Rick Stern, 07709 746771, rick.stern@primarycarefoundation.co.uk
© Primary Care Foundation