Demonstrating the Impact of
Careers Guidance
Lester Oakes
President IAEVG
Karen Schober
Vice-president IAEVG
Bryan Hiebert
Vice-president IAEVG
A Challenge from Policy Makers
You say you are providing effective services
We believe you
BUT
Show us the evidence
Policy-Practice-Research Dialogue
1. 1999: First International Symposium on Career
Development and Public Policy
2. Establishment of International Centre for
Career Development and Public Policy
• IAEVG has been an active supporter of the
International Centre
3. International symposiums held in Canada,
Australia, Scotland, New Zealand
4. 2009: Sixth International Symposium on Career
Development and Public Policy
5. The predominant theme was "Prove it Works"
Overview
IAEVG has been an active participant in past
symposiums, and outcome-focused, evidence-based
practice has been an important part of
IAEVG strategic planning
• Update from European Lifelong Guidance
Policy Network
• Update from Canadian Research working
Group on Evidence-based Practice in Career
Development
Karen Schober
will update us on what is happening in the
European Lifelong Guidance Policy Network
Outcome-Focused Evidence-Based Practice
Input → Process → Outcome
Framework developed by the
Canadian Research Working Group
on
Evidence-Based Practice in Career Development
Outcome-Focused Evidence-Based Practice
Input → Process → Outcome
Indicators of client change
1. Learning outcomes
• Knowledge and skills linked to intervention
2. Personal attribute outcomes
• Changes in attitudes
• Intrapersonal variables (self-esteem, motivation, independence)
3. Impact outcomes
• Impact of #1 & #2 on client's life, e.g., employment status, enrolled in training
• Societal and relational impact
• Economic impact
Outcome-Focused Evidence-Based Practice
Input → Process → Outcome
Activities that link to outputs or deliverables
Generic interventions
• Working alliance, microskills, etc.
Specific interventions
1. Interventions used by service providers
• Skills used by service providers
• Home practice completed by clients
2. Programs offered by agency
3. Involvement by 3rd parties
Outcome-Focused Evidence-Based Practice
Input → Process → Outcome
Resources available
1. Staff
• Number of staff, level of training, type of training
2. Funding
• Budget
3. Service guidelines
• Agency mandate
4. Facilities
5. Infrastructure
6. Community resources
Outcome-Focused Evidence-Based Practice
Input → Process → Outcome
Intervention
=
Process + Outcome
What will I do? + How is it working?
Professional Practitioner
Evidence-based Practice
1. Research trials (traditional way in psychology)
2. Professional Practitioner
• Purposeful intervention
• Data (= evidence) on what was done
• Evidence on client change
• Look for patterns in data
linking intervention with outcome
• Develop scientific attitude toward practice
• Skepticism
• Curiosity
• Inquiry
Intervention Planning & Intervention Evaluation
Intervention Planning Framework
[Diagram: Context: Client Needs → Goals (client) → Strategy (counsellor) → Strategy (client) → Client Outcomes: Knowledge, Skills, Attributes, Impact]
Intervention Planning & Intervention Evaluation
Intervention Planning Framework
[Diagram: Context: Client Needs → Goals (client) → Strategy (counsellor) → Strategy (client) → Client Outcomes: Knowledge, Skills, Attributes, Impact]
Intervention Evaluation Framework
Inputs → Processes → Outcomes
Quality of Service Delivery
1. Accessibility
• Regular hours
• Extended hours
• Physical accessibility
• Resources in alternate format
• Ease of access, who can access
2. Timeliness
• % calls answered by 3rd ring
• Wait time for appointment
• Wait time in waiting room
3. Responsiveness
• Respect from staff
• Courteous service
• Clear communication
4. Overall satisfaction
• % rating service good or excellent
• % referrals from other clients
Need to negotiate these with funders
Comprehensive Service Evaluation
Quality Service framework
Service delivery
• Client volumes
• Client presenting problems
• Number of sessions
Service standards
• Staff credentials, competencies, resources
• Efficiency (are client needs being met?)
System requirements
• Adherence to mandate
• Completion of paperwork
• Cost-effectiveness
Negotiated outcomes
• If you are lucky, funders might identify personal attributes [client motivation, improved job satisfaction, increased self-confidence], or knowledge, or skills, as accountability indicators
• BUT more likely funders will identify impact outcomes [employment status, enrolment in training, reduced # of sick days, increased productivity, etc.] or inputs [client flow, accessibility, timeliness of paperwork, etc.]
• So service providers need to identify the knowledge, skills, and personal attributes that will produce the impacts and negotiate these as accountability indicators
Be careful what you promise to deliver BUT
deliver what you promise
Promise small – DELIVER BIG
Assessment as Decision Making (vs. Judgement)
Please use a two-step process
1. Would you say that your level of mastery of the attribute under consideration is unacceptable or acceptable?
unacceptable: 0 1 | acceptable: 2 3 4
2. Then assign the appropriate rating
• 0 = really quite poor
• 1 = just about OK, but not quite
• 2 = OK, but just barely
• 3 = in between barely OK and really good
• 4 = really very good
Problem with skill self-assessment
Participants asked to rate their skill
(or knowledge) before and after a program
Often, pre-workshop scores are high
and post-workshop scores are lower
• People find out as a result of the workshop that they knew
less than they thought or had less skill than they thought
• Based on the new awareness, post-scores are lower
People don’t know what they don’t know
How can we get around this problem?
Assessing Learning & Attribute Outcomes
Post-Pre Assessment
We would like you to compare yourself now and before
the workshop. Knowing what you know now, how
would you rate yourself before the workshop, and how
would you rate yourself now?
Please use a two-step process:
• Decide whether the characteristic in question is
acceptable or unacceptable, then
• assign the appropriate rating
unacceptable: 0 1 | acceptable: 2 3 4
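Scoring a post-pre instrument works like any paired pre/post measure. A minimal sketch in Python; the item names and ratings below are hypothetical, using the 0-4 scale above:

```python
# Hypothetical post-pre ratings for one participant on the 0-4 scale.
# "pre" = retrospective rating of the starting point, "post" = rating now;
# both are collected after the workshop, so they share one frame of reference.
ratings = [
    {"item": "interview skills", "pre": 1, "post": 3},
    {"item": "labour market knowledge", "pre": 0, "post": 2},
    {"item": "resume writing", "pre": 2, "post": 4},
]

ACCEPTABLE = 2  # ratings of 2-4 fall in the "acceptable" band of the scale

gains = [r["post"] - r["pre"] for r in ratings]
moved = sum(1 for r in ratings if r["pre"] < ACCEPTABLE <= r["post"])

print(f"mean gain: {sum(gains) / len(gains):.1f}")  # mean gain: 2.0
print(f"items moving from unacceptable to acceptable: {moved}")  # 2
```

Because both ratings are made from the same post-workshop frame of reference, the gain is not deflated by the "didn't know what they didn't know" problem described on the previous slide.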
CRWG: Ongoing Projects
1. Validate framework and approach
• ACT evaluation
2. Field test interventions
• Develop evaluation component as part of
intervention
• SME project
3. Field tests of practitioner use
• LMI project
4. On the horizon
Applied Career Transitions Program
Module 1: Building Career Foundations
Module 2: Developing Career Opportunities
Module 3: Getting Experience: The Internship
• On-line Program Curriculum
• Program Access Options
– On-line only
– Coached (four coaching appointments per module)
– Coached with In-class Group Sessions (four coaching appointments and four group sessions per module)
Results: Post-Pre Assessment
For Module 1
• Altogether there were 10 (items) × 29 (participants) = 290 ratings
• Pre: 144 Unacceptable Ratings and 6 Exceptional Ratings
• Post: 3 Unacceptable Ratings and 130 Exceptional Ratings
• Exceptional Ratings increased from 2% to 44% of the participants
• Pre: 50% Unacceptable Ratings; Post: 86% Acceptable Ratings
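The counts above can be turned into percentages with a few lines of Python. Note these are computed over all 290 ratings, whereas the slide reports some figures per participant, so small differences are expected:

```python
# Counts from the Module 1 post-pre assessment: 10 items x 29 participants.
items, participants = 10, 29
total = items * participants  # 290 ratings in all

counts = {
    "pre unacceptable": 144,
    "pre exceptional": 6,
    "post unacceptable": 3,
    "post exceptional": 130,
}

def pct(n):
    """Share of all ratings, rounded to the nearest whole percent."""
    return round(100 * n / total)

for label, n in counts.items():
    print(f"{label}: {pct(n)}%")
```

For example, unacceptable ratings fall from 50% of all ratings before the module to about 1% after it.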
Results: Impact outcomes
Module 1
• 23 out of 29 had found a job
• 10 of the jobs lined up well with career vision
Module 2
• 4 out of 6 had found a job
• 3 of the jobs lined up well with career vision
Attribution for Change
To what extent would you say that the changes
depicted above were the result of completing
Module 1 of the ACT program, and to what extent
were they a function of other factors in your life?
mostly other factors: 0
somewhat other factors: 0
uncertain: 0
somewhat this program: 10
mostly this program: 19
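From these counts, every respondent credited the program at least in part. A quick check in Python, using the counts from the slide:

```python
# Attribution ratings for Module 1 (29 participants), from the slide.
responses = {
    "mostly other factors": 0,
    "somewhat other factors": 0,
    "uncertain": 0,
    "somewhat this program": 10,
    "mostly this program": 19,
}

n = sum(responses.values())
credited = responses["somewhat this program"] + responses["mostly this program"]

# All 29 respondents chose "somewhat" or "mostly this program".
print(f"{100 * credited / n:.0f}% attribute the change at least partly to the program")
```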
WSI Project
Career Development in SMEs
Career Development interventions
for Small and Medium Enterprises to
promote personal ownership for career planning
and within-organization career mobility
Three types of intervention
• Minimal: Web-based self-help intervention
• Moderate: On-the-job career conversations with managers, supervisors, colleagues
• Intensive: Bilan de compétences (a formal skills assessment)
WSI Project
General approach
1. Literature review
• What is known already about the three areas
of intervention
2. Formal needs assessment
3. Design intervention AND evaluation plan
4. Field test intervention
5. Consolidate evaluation evidence
6. Final report
7. Market results
WSI Project
General results
1. All 3 interventions worked
2. Different programs
• designed for different purposes
• requiring different resources
• Choose the approach that best meets
organizational needs
3. It’s better to do something than nothing
Practitioner Field Tests
1. Use and impact of Labour Market Information (LMI)
• Isolate LMI from other interventions
2. Participant research approach
• Normal clients seeking service
• Agencies offering services
3. Use Post-Pre approach
Research Design
[Diagram: interventions (Job Search, CDM) crossed with delivery modes, with measurements at Time 1 and Time 2]
LMI: General Results
1. All intervention-delivery combinations produced significant change
• General ability to access and use LMI
• Knowledge about how to use LMI
• Skills for using LMI and taking action
• Personal attributes, e.g., optimism, confidence
2. Assisted use produced greater change across time than independent use
3. 80% of clients attribute change to the program and not other factors
On the Horizon
Common Indicators Project
1. Identify 3-5 indicators of success that all
agencies will collect
• Focus groups to identify outcomes and data
sources
2. Aggregate results across agencies
3. Increase power of results
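The "increase power" point can be illustrated numerically: pooling the same indicator across agencies shrinks the standard error of the estimate. A sketch with hypothetical agency counts (the numbers are invented for illustration):

```python
import math

# Hypothetical (successes, clients) for one common indicator at three agencies.
agencies = [(18, 25), (30, 40), (22, 35)]

def stderr(successes, n):
    """Standard error of a sample proportion."""
    p = successes / n
    return math.sqrt(p * (1 - p) / n)

# Pool the raw counts across agencies before computing the proportion.
pooled_s = sum(s for s, _ in agencies)
pooled_n = sum(n for _, n in agencies)

for i, (s, n) in enumerate(agencies, 1):
    print(f"agency {i}: p = {s/n:.2f}, SE = {stderr(s, n):.3f}")
print(f"pooled:   p = {pooled_s/pooled_n:.2f}, SE = {stderr(pooled_s, pooled_n):.3f}")
```

The pooled estimate has a smaller standard error than any single agency's, which is exactly what collecting 3-5 common indicators across agencies buys.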
Resources Available
• Major report on evaluation practices
• Special issue of Canadian Journal of Counselling
• Sample tools and data gathering instruments
• Evaluation workbook
• Presentation notes, power point slides, etc.
• Survey/needs assessment for self-help intervention
http://www.crwg-gdrc.ca
To demonstrate value, we need to develop a
culture of evaluation. We need to reach the state where:
• Identification of outcomes is an integrated part of providing services
– Without efficacy data, career services are vulnerable
– It is in our best interest to gather evidence attesting to the value of the services we provide
• Measuring and reporting outcomes is integrated into practice
• Outcome assessment is a prominent part of counsellor education
• Reporting outcomes is a policy priority
This needs to be a priority in all sectors
Demonstrating the Impact of
Careers Guidance
Questions and Comments
Lester Oakes
President IAEVG
Karen Schober
Vice-president IAEVG
Bryan Hiebert
Vice-president IAEVG
membership@iaevg.org