Success in the workplace

Performing to make a difference to children and their families
Jo Fox, Child Centred Practice
How much do you understand about…
Organisation
• Definition of success
• Definition of failure
Front line staff
• Definition of success
• Definition of failure
Service user
• Definition of success
• Definition of failure
What does success look like to you?
How much have you thought about being successful in your workplace?
Do you have an internal script for success?
Do you expect to be successful?
Do you expect to be recognised for being successful?
By whom?
What does success look like to service users?
What expectations do they have of us?
What expectations do they have of themselves?
Do we understand and talk about each other's purpose enough?
Understanding failure
• What is your failure policy?
• How do you define failure?
• What does failure mean to you?
• How do you use failure?
• What is your organisation's failure policy?
• How is failure defined?
• What does it mean?
• How does the organisation use failure?
Where do our goals overlap?
• How strongly do you identify with your organisation's goals and values?
• Are you proud to be a part of your organisation?
• Are the functions you carry out what the organisation and service user need you to do?
• Do you believe that your task is worthwhile and significant?
• What benefits will you gain personally from achieving your task professionally?
• Does it matter to you if your organisation fails?
• Does it matter to your organisation if you are successful at carrying out your tasks?
• Does it matter to the service user if you are successful in your work or not?
HOW DO YOU KNOW?
[Diagram: overlapping circles showing the organisation's, the service user's and your own definitions of success]
Think about your most valuable failure
What did it tell you about yourself?
About the people you worked for?
About the people you worked with?
Links between performance and outcomes
The '60s
• After WWII, local authorities responded to the fact that some areas required services that cut across the traditional departments.
• Councillors were concerned with budgets, while performance was the domain of the professional: standards and accountability were maintained through professional integrity, and scrutiny of any kind seemed like an affront.
The '80s
• The Audit Commission started work in 1983.
• In 1986, it published a well-regarded toolkit for performance review which introduced the idea of the three Es (economy, efficiency and effectiveness).
• Councils were encouraged to compare value for money and performance against a range of indicators, though these were mainly cost-based at this stage.
The '80s
• It was also a period which saw an increased focus on what Professor John Stewart (Birmingham University's INLOGOV) described as the "wicked issues": demands and problems which cut across traditional departmental and service "silos" and which demanded a more corporate approach.
The 80’s
• Some councils (a notable example being the London
Borough of Bexley) introduced systematic processes for
corporate planning and review.
• Typically, these included a high level corporate plan (setting
out the main council aims and priorities) supplemented
with service plans reviewed annually.
• Many councils set up performance review committees
(prompted by local auditors and the work of the Audit
Commission) - although it was many years before these
became an effective agent of change within organisations.
The late '80s
• The next decade saw an increasing challenge to the traditional belief that council services were delivered to people by professional experts whose judgements should not be questioned.
• In the late 1980s, York City Council pioneered the idea of published "citizen charters", setting out, in plain language, the standards of service that users should expect and what to do if services failed to meet expectations.
• This idea was taken up by John Major's Government which, in 1993, launched a national citizen's charter movement incorporating a Charter Mark award scheme and, for the first time, a set of statutory performance indicators (organised and validated by the Audit Commission).
• This same period also saw a massive expansion across local government in the use of market research to explore the attitudes of the local population to council services and the way they were delivered. This had previously been resisted by many councillors, who saw it as a threat to their traditional view of themselves as the voice of the local community.
The noughties
• The early part of the twenty-first century can be seen as a logical continuation of these trends: the citizens' charter and compulsory competitive tendering (CCT) morphing into the best value regime and then into the comprehensive performance assessment (CPA).
• Gradually, there was growing awareness that Stewart's "wicked issues" were not going away; indeed, they went far beyond the collective powers of the entire council and required cooperative working with outside agencies such as the health service, police, voluntary and business sectors. CPA became (albeit briefly) the comprehensive area assessment (CAA).
• The Total Place initiative has started to explore in depth the extent to which public bodies (both national and local) have the same clients and objectives and, individually, have considerable resources, yet lack the organisational and political capacity to operate in an effective "joined up" way to deliver services which are cost-effective and which meet service user and community needs.
The new
• The coalition government has dismantled the old performance frameworks, and the Audit Commission is to be abolished.
• In its place is an expectation that:
• local authorities are responsible for their own performance and for leading the delivery of improved outcomes for their area;
• local authorities are accountable to their local communities.
Context – preparing for leaner times
In March 2010, the Audit Commission summarised the challenge facing councils, over the next decade or so, as:
"The financial impact of the recession has been manageable for most councils up to January 2010.
• The government has honoured the three-year grant settlement up to 2010/11; on average, grant is two-thirds of council income.
• Staff pay increased by 1 per cent in 2009/10, less than expected.
• Many councils received a windfall VAT refund.
• Many councils, especially districts, have enough reserves to cover short-term funding pressures.
• However, some councils – often districts – have been hit hard by falling local income.
• Development-related income has reduced; planning applications are down by 22 per cent.
• Investment income fell by £544 million (43 per cent) in 2008/09, and the fall continued in 2009/10.
• Capital receipts are down from over £3.5 billion in 2007 to just £800 million in the first three quarters of 2009.
• Some districts that rely heavily on local income are struggling."
Each individual needs to think carefully about how they are spending their time and resources.
Performance is no longer something done by "funny little people in small dark cupboards that has nothing to do with the real world". It is what you do every day to ensure your efforts do not go to waste.
The benefits and risks
• Good councils will flourish as they focus on local priorities with less national prescription, but coasting or defensive organisations may flounder.
• We will have to publish a lot of data but will be set far fewer associated national targets. Again, this is an opportunity for good councils, but also a threat if councils do not rigorously drill down on key performance metrics.
[Diagram: the pieces essential for change, each paired with the symptom that appears when it is missing]
Key: ESSENTIAL FOR CHANGE → symptom of missing piece
• BURNING PLATFORM → apathy & complacency
• VISION → lack of direction or coherence, so change fizzles out
• LEADERSHIP → poor alignment & inertia
• CAPACITY & CAPABILITY → anxiety & frustration
• COMMUNICATE & ENGAGE → people feel the change won't affect them
• OWNERSHIP AT ALL LEVELS → poor design that won't last
• QUICK WINS → cynicism & disbelief that change is possible
• PERSONAL IMPACT → lack of individual commitment
• EMBED CHANGE SO IT'S BUSINESS AS USUAL → revert to the old ways
Encouraging Innovation
[Diagram: conditions that encourage innovation, grouped under Resources, Risk taking, Targets, Information, Emotional support, Process, Rewards and Relationships. The elements include: funding; time; tools; training; flexibility; authority to act; trying new things; learning from failure; 'stretch' targets that set the what, but not the how; a specific call for innovation; a clear case for need; tie to strategic plan; alignment with organisational goals; free-flowing, uncensored, unfiltered and unsummarised information; wide-scope search; balanced assessment; recognition; honouring everyone's input; diversity; intrinsic motivation; a trusting, open environment; encouragement for skills development; individualised and team-based rewards]
Key performance indicators
• Can these be usefully re-framed as "is what I do each day making a good difference to the lives of children and their families?"
• What would your key performance indicators look like?
• What evidence base do you have for your performance indicators?
Unpicking indicators
The indicator
• Completing an initial assessment within 10 days
The evidence
• Messages from research
• Poor decision making
• Children in drift
• No services, only assessment
• Child protection action taken without understanding the issues and needs of the child
How do we measure things?
• Outputs
• Outcomes
• Indicators
• Targets
Outcome or output?
• Outcomes are end results. They can describe different aspects of wellbeing for whole populations – for example, all children, as with the Every Child Matters (ECM) outcomes – or they can refer to the wellbeing of users of a particular service or intervention over time. Examples are a safe community, a clean environment, or a reduction in the number of looked-after children. These are outcomes, not outputs.
• Outputs describe service specifications, delivery mechanisms and procedures. For example, a successful parenting support programme might deliver a specific number of training sessions and increase the number of trained facilitators and participating parents. These are outputs, not outcomes.
Measuring outcomes instead of activity
• What should the measure be?
• Who should we ask?
• What should we ask?
• When should we ask?
• How can we share the results?
IS IT AN OUTCOME, INDICATOR OR PERFORMANCE MEASURE?
1. Safe Community
2. Crime Rate
3. Average Police Dept response time
4. A community without graffiti
5. % of surveyed buildings without graffiti
6. People have living wage jobs and income
7. % of people with living wage jobs and income
8. % of participants in job training who get living
wage jobs
The difficulty with assigning value
The economic rationalist approach to understanding outcomes assumes that everything can have a value attached.
What do we need to measure the right things?
Left brain
• Bureaucracy
• Systems
• Politics
• Planning
• Rule compliance
• Analysis
• Formality
• Order
Right brain
• Autonomy
• Ownership
• Risk taking
• Rewriting rules
• Informality
• Synthesis
• Intuition
• Trust
“Not everything worth counting can be
counted; and not everything that can be
counted counts”
MEASURING OUTCOMES NOT ACTIVITY
What are you counting?
• How often do you measure what you do each day?
• How often do you measure the work of your colleagues and peers?
• How much faith do you have in the tools you are using?
• What would you do differently?
• How much responsibility are you taking for ensuring that you understand what 'doing a good job' means?
The hidden dynamics
Behaviour goals (visible) | Behaviour-blocking goals (not visible) | Hidden motivation to maintain blocking behaviours
• Be more receptive to new ideas | Giving curt responses to new ideas – a closing-off, cutting-off or overruling tone | To have things done my way
• Be more flexible in my response, especially to new structures and ideas | Not asking open-ended questions or seeking the opinions of others enough | To experience myself as having a direct impact
• Be more open to delegating and supporting new lines of authority | Communicating with others so much, and so frequently, that they have to get back to me | To feel the pride of ownership, to see my stamp
• – | Being too quick to give an opinion when it has not been asked for | To preserve my sense of myself as a super problem-solver
Other models for valuing outcomes
• Appreciative inquiry is a strategy for intentional change that identifies the best of 'what is' to pursue dreams and possibilities of 'what could be': a cooperative search for the strengths, passions and life-giving forces that are found within every system and that hold potential for inspired, positive change.
• It is a process of collaborative inquiry, based on interviews and affirmative questioning, that collects and celebrates the 'good news stories' of a community; these stories serve to enhance cultural identity, spirit and vision.
• Appreciative inquiry is an approach which focuses on a desired future or outcome and is different from a problem-solving approach.
Appreciative inquiry
Its four guiding principles are:
1. Every system works to some degree; seek out the positive, life-giving forces and appreciate the best of 'what is'.
2. Knowledge generated by the inquiry should be applicable; look at what is possible and relevant.
3. Systems are capable of becoming more than they are, and they can learn how to guide their own evolution – so consider provocative challenges and bold dreams of 'what might be'.
4. The process and outcome of the inquiry are interrelated and inseparable, so make the process a collaborative one.
How would you make this work in Children's Services?
• What gets in the way?
• What helps?
Cost benefit analysis in social care
Benefits
• Knowing how to allocate precious resources
• Being able to move services flexibly with the child's and family's needs
• Commissioning effectively to meet community needs
Challenges
• Counting the wrong thing
• Confusing cost with value
How could you make this work in Children's Services?
• What gets in the way?
• What makes it work?
Outcome-based results
Drawbacks
• It is sometimes hard to evidence outcomes.
• The results of the work we do can take years to show in a child's development.
• We can miss the underlying causes if we always treat the symptom – short-term gain only.
Strengths
• It makes us work towards something tangible.
• It should result in changes that are valuable to the child and their family.
How would you make this work in children's services?
• What gets in the way?
• What helps?
The most significant change
• What ‘soft data’?
• How do we collect it?
• What gets in our way?
Most Significant Change
The most significant change (MSC) technique is a form of participatory monitoring and
evaluation. It is participatory because many project stakeholders are involved both in
deciding the sorts of change to be recorded and in analysing the data. It is a form of
monitoring because it occurs throughout the program cycle and provides information to help
people manage the program. It contributes to evaluation because it provides data on impact
and outcomes that can be used to help assess the performance of the program as a whole.
When to use:
• Program evaluation.
• Organizational review and evaluation.
• Building community ownership through participatory evaluation.
How to use:
The process involves the collection of significant change (SC) stories from the field level, and
the systematic selection of the most important of these by panels of designated stakeholders
or staff. The designated staff and stakeholders are initially involved by ‘searching’ for project
impact. Once changes have been captured, various people sit down together, read the stories
aloud and have regular and often in-depth discussions about the value of the reported
changes. When the technique is successfully implemented, whole teams of people begin to
focus their attention on programme impact.
From: http://www.mande.co.uk/docs/MSCGuide.pdf by Rick Davies and Jess Dart
These ten steps are usually included:
1. Raising interest at the start.
2. Defining the domains of change.
3. Defining the reporting period.
4. Collecting SC stories.
5. Selecting the most significant of the stories.
6. Feeding back the results of the selection process.
7. Verifying the stories.
8. Quantification.
9. Secondary analysis and meta-monitoring.
10. Revising the system.