Appendix 7 Analytical Toolkit for Evidence Utilization

In our full report, we argued that practitioners and policy groups need to
significantly rethink evidence utilization in commissioning. In this section we
provide analytical tools that may help those involved in commissioning to
rethink how they can make better use of evidence.
Mobilising a Plurality of Evidence: Key Considerations
Our results indicate that practising evidence-based commissioning most
likely involves drawing upon a plurality of 'evidence'. Although one may not
be able to pre-determine exactly how an object will become evidence in
decision making, our research sheds light on promising ways to use
different forms of evidence.
In thinking about using evidence, practitioners may benefit from considering
different modes of utilization, as illustrated in the following figure.
Commissioners may need to exercise judgement and scrutinise each form of
evidence before deciding whether it is indeed 'evidence' that is fit for their
purpose. For example, examples of best practice may need to be used with
care if they do not reflect local population needs. Below we offer our
thoughts on how to make the use of 'best practice' more fruitful.
Currently, 'best practice' is disseminated in a variety of ways, some of which
are embedded in:

• National Service Frameworks (NSFs), departmental policy and target setting
• Journals
• Conferences
• Reports from 'pathfinder' and 'pilot' sites
• Regional meetings and networks
• Organisations such as the NHS Institute for Innovation and Improvement,
  through their literature, conferences and websites
As we showed, examples of best practice are used extensively in
commissioning. Commissioners need to be aware of possible challenges that
may arise from relying on and trying to mimic 'best practice'. In our case
studies, some of these challenges were rooted in the following:

• Differences between the environmental context of the site(s) of the case
  study and the commissioning organisation using it. The differences can be
  in the characteristics of the population served; the structure of local
  healthcare or social care organisations; the preparedness of the
  organisation or individuals for change; the presence or absence of
  influential local champions; and in the cultures of the organisations
  involved.
• The amount of information available from 'best practice' cases varies:
  some do not include much financial or activity data, some include examples
  of contracts, job descriptions, project plans and other documentation
  which have been used.
• Where best practice case studies result from national 'pathfinder' sites,
  they have sometimes benefited from additional funding and/or resources
  (e.g. expert advice from national leaders in the field, preferential rates
  from IT companies in a 'quid pro quo' for development of software, and/or
  the availability of business analysts).
• It is not clear whether best practice case study sites are re-visited to
  see whether new pathways have become embedded and continue to live up to
  early expectations.
Co-producing Commissioning Solutions: Key Considerations
In addition to mobilising a plurality of evidence, commissioners may need to
plan processes of co-production more systematically. When thinking
explicitly about co-production strategies, practitioners may benefit from
addressing some of the questions proposed below:
> What is the purpose of co-production?
> What are the stakes of the people involved in a redesign project? Are
  there vested interests, e.g. could a party shape service design in
  illegitimate ways? How important is it for us to leverage the expertise of
  other parties?
> How are we going to proactively manage conflicts of interest or divergent
  objectives of multiple parties?
> Are there disincentives for existing providers to change the service?
> What incentives can we create to strengthen our negotiating position? Do
  we need to build strong relationships with existing providers?
> Do we want to agree with existing providers on a set of nationally defined
  service specifications? Or do we depend on other parties for their
  expertise in designing a good service?
> Are we intending to go down the route of full procurement? If so, at which
  stage should parties be excluded from discussions about a service?
The following decision tree may also aid practitioners in their efforts to
design co-production strategies.
Managing Interdependencies in Co-Production
Co-production of evidence inevitably entails collaboration. This means
managing interdependencies between people and processes, tasks and purposes,
and sources of information.

Process Interdependence Management
Furthermore, we suggest that designing evidence utilization interventions
may need to account for key dimensions of interdependence management.
Questions such as the following may help practitioners address important
interdependencies:
• Do the people involved in a redesign project understand their roles and
  responsibilities? Is there a shared understanding of who is doing what,
  when and why? Have roles and responsibilities been openly discussed and
  agreed? Who is in charge?
• Are there project management arrangements in place, e.g. a project plan,
  timescales, resources, deliverables and outcomes? Are these suited to the
  project's scope and aims? Do we need to enforce strict project management
  in light of our objectives or not?
• Is there a well-understood formal decision making process in the
  organisations involved? Who should endorse decisions? On what issues?
• What kind of expertise/knowledge will be needed for the success of the
  project? Do we need information analysts who can interpret complex
  activity intelligence? Do we need contracts managers? Have we consulted
  different experts about when we may need their input?
• Have we thought about how we can strengthen relationships within a project
  group? Have we put effort into building trust with each other?

Task Interdependence Management
• What are the key dimensions that would make a commissioning solution
  organisationally and more broadly acceptable? Can we prove that our
  day-to-day activities align with our objectives?
• Have we thought about how the different stages of commissioning interact,
  e.g. service redesign, procurement and contracting? Should we involve
  information, finance and contracts experts from the start of service
  redesign? Do we need to make them core members of our project group?
• Have we collected and understood evidence of the cost and clinical
  effectiveness of an intervention? What is a reasonable estimated time
  period for an intervention to produce certain measurable outcomes? Is it
  easy to evaluate?
Individual Funding Requests: Key Considerations
Finally, our research has important practical implications for designing
individual funding request (IFR) decision making processes.
As we highlighted in our empirical findings chapter, the interface between IFR
decision making and commissioning policies needs to be explicitly taken into
consideration. Commissioners need to consider very carefully how
commissioning policies will be developed and renewed in an ongoing fashion.
As our practitioners highlighted, unless due consideration is given to
organisational and governance arrangements, IFR decisions can become very
problematic and threaten organisational reputation.
In addition to considering the formal characteristics of IFR decision
making, practitioners may benefit from reflecting systematically on
the informal, yet consequential, processes of making sense of, and
deliberating on, the funding merits of IFRs. The following table may
help practitioners in this regard.
Ideas — questions for improving the process

Categorising
(Competencies) Do we pay sufficient attention to the ways we categorise
requests? Is there the right expertise to do so? Is there a mechanism to
question the categorising of cases? Are we getting 'comfortable' with the
ways we categorise IFR requests?
(Information and evidence) Do we have sufficient information to define a
case? Are the IFR forms adequate? Is further interaction with the requestor
needed?

Establishing genuineness of request
(Competencies) Is there the right expertise for this aspect? Do we have the
right local knowledge? Is there a mechanism to question the authenticity of
a case? How much do we rely on previous experience to determine genuineness?
(Information and evidence) Do we have sufficient information to define a
case? Can we refine the IFR forms? Is further interaction with the requestor
needed to e.g. provide opportunities for clarifications?

Assembling a narrative
(Competencies) Do we have enough knowledge to assemble a compelling
narrative? Is the right expertise available to explore alternative
interpretations? Do we fill a lot of the gaps of the stories unconsciously?
Do we have a mechanism to question a case story?
(Information and evidence) Do we have sufficient information to define a
case? Would we like requestors to provide clarifications? How do we support
requestors to make their submissions 'easier to follow'?

Constructing a public justification of decision
(Competencies) Does our panel have the diverse expertise to review the
merits of a case? Have we distinguished among multiple criteria for
justifying our decisions? Is there a rigorous mechanism to scrutinise
evidence for each criterion? Do we pay attention to the (lack of)
requestors' skills to assemble and interpret evidence for their cases? Do we
provide opportunities for requestors to understand the grounds of our
decision? Do we allow 'external others' (non-panel members) to question the
fairness of our decisions? Have we looked at national guidance methodically?
(Information and evidence) Are our IFR rules robust enough? Have we involved
all stakeholders in developing our IFR policy? Do we have a formal mechanism
to assemble and interpret evidence of clinical and cost effectiveness? Do we
have a solid decision making framework that helps us generate a thoughtful
and legitimate response to a request?