Informing Practice
Quality Assessment
July 31, 2006
By the end of this session, we will …
• Participate, learn and have fun!
• Answer,
– Why is it important to ask?
– How do we “inform our practice” through
four stages?
– Will assessing make a difference?
– Do I have the skills to begin successfully
and feel good about what I am doing?
Why Bother?
• If you always do …
• Our need to know…
– How are we doing?
– How well are our students doing?
– How do we know?
– Have we made a difference?
– Have we met our goals?
• Answer in a systematic way: credibility
Why Bother?
Asking:
• Provides “informed answers”
– Speak knowledgeably to answer, “How do you know?”
• Demonstrates that we are serving our
students:
– Interested in knowing whether we delivered what we promised; listening
– Gathering evidence to use to improve
• Contributes to our own learning
Natural Process
Asking after an event occurs:
• What was good about it?
• What was not so good?
• What will we do differently next time? Why?
Guiding Principles
• There is no single “right answer”
• Process of learning together
• Value added process through synergy
Informing Practice
The Assessment Cycle, grounded in the Mission Statement and Strategic Goals:
1. Setting Measurable Goals
2. Planning to Reach Goals
3. Data Collection
4. Data Analysis, Reporting, and Action
1. Setting Measurable Goals
Planning Questions and Goals:
• What do we want our students to be able to know and do?
– Students will participate in an effective experience that develops their interpersonal and leadership skills.
• What are observable and measurable outcomes (behaviors to track) that will let us know what our students know and can do?
– Students will be able to rate the two aspects of the experience, demonstrate an example of applying their collaboration skills, and evaluate their leadership actions.
• What tasks will students engage in that demonstrate what we expect of them?
– Students will engage in experiences with two aspects where they apply their collaboration skills and evaluate their leadership actions.
• What tool is used to measure the indicator?
– Tools may be a survey, interview, observation checklist, etc., based on the outcomes.
2. Planning to Reach Goals
• Advice from “The expert”
• Review goals
• Data collection design
• Data analysis, reporting, and action
• Set future goals
3. Data Collection
Planning for Success:
• Purpose
• Process:
– Who?
– What?
– How?
– When?
• Lessons Learned
4. Data Analysis, Reporting, and Action
Results
• Analyze data to learn what was said
• Report and communicate to
“Close the Loop”
• Action plans for the future
Data-driven decision making
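As a minimal sketch of what the analyze-and-report step can look like for closed-ended data, the Python below (with hypothetical question labels and ratings) tallies responses per question so the report contains only aggregate counts and means, never any individual's answers.

```python
from collections import Counter

# Hypothetical closed-ended responses: one dict per participant,
# mapping question labels to ratings on a 1-5 Likert scale.
responses = [
    {"Q1": 4, "Q2": 5},
    {"Q1": 3, "Q2": 4},
    {"Q1": 4, "Q2": 2},
]

# Tally ratings per question: the report shows aggregate
# numbers only ("Data will be reported in aggregate form only").
tallies = {}
for response in responses:
    for question, rating in response.items():
        tallies.setdefault(question, Counter())[rating] += 1

for question in sorted(tallies):
    counts = tallies[question]
    n = sum(counts.values())
    mean = sum(r * c for r, c in counts.items()) / n
    print(f"{question}: n={n}, mean={mean:.2f}, "
          f"distribution={dict(sorted(counts.items()))}")
```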
Completing the Cycle
3. Data Collection (review)
Planning for Success:
• Purpose
• Process:
– Who?
– What?
– How?
– When?
• Lessons Learned
Purpose
• Clearly write: “The purpose for our survey is to…”
• State (determine, discover):
– What we want to know
– What we will do with the information; how we will use the assessment results for improvement
Informs our Data Collection Design
Turns into Letter to Participants
Process – Who?
Who do we ask?
• As researchers, we cannot assume that
we know what everyone is thinking
• The ones who are able to answer the
questions from their perspective
Process – What?
• What do we ask?
– Review purpose statement
• Pilot:
– Do the questions work?
– What information will they give us?
– Will the information inform our decision
making?
– Be cognizant of the participant's time
Process – How?
How do we ask to get information?
• Open-ended question format
• Closed-ended question format
– Likert (feelings / attitude / opinion) scale of
1 to 5; 1 to 7; 1 to 4; others
– Yes / No answers
• Paper / electronic
• Focus groups
– Using scripts; recorder / cross-check; skilled interviewer
• Consider the need for:
– Consent letter / Institutional Review Board (IRB)?
– Anonymous / confidential?
Process – When?
When do we ask?
• Immediately, or risk “time heals” syndrome
• Later, to benefit from reflection
• Check Survey Central
– Has it been asked before?
– Avoid “survey fatigue”
Lessons Learned
• Critique survey examples
• Response population analyzed
• Letter to Participants
• Pilot
• Scales and ratings
Lessons Learned
Critique survey examples
Lessons Learned
Response population analyzed
• What is a good response rate?
• Sample Size Calculator
• Population Profile
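The Sample Size Calculator is not specified further on the slide; as a minimal sketch of what such a calculator typically computes, the Python below applies Cochran's formula with a finite-population correction (the population size, margin of error, and response rate are illustrative assumptions, not figures from this presentation).

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Cochran's formula with finite-population correction.

    z=1.96 corresponds to 95% confidence; p=0.5 is the most
    conservative proportion (it maximizes the required sample).
    """
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Illustrative numbers: a program serving 400 students.
needed = sample_size(population=400)
print(f"Completed responses needed: {needed}")

# Working backward from an assumed 60% response rate tells us
# how many invitations to send (the rate itself is a guess).
print(f"Invitations to send: {math.ceil(needed / 0.60)}")
```

Inverting the calculation this way gives one concrete answer to “What is a good response rate?”: one high enough to yield the sample your margin of error requires.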
Lessons Learned
Letter to Participants
• Content
• Message
Lessons Learned
Write an excellent letter of invitation to participate:
• Identify yourself and why the survey is happening
• What will be done with the results
• Note changes made in past
• “Data will be reported in aggregate form only”
• Incentives?
• Signatory? Personal connection makes a difference
• Remember: Be empathetic
Lessons Learned
Pilot
• Is wording clear?
• Do the questions “work”?
– What information will they give us?
– Is the information meaningful?
• Be cognizant of the participant's time
Lessons Learned
Likert Scale
• 5-point scale: mid-point cluster
• 7-point or higher scale: many choices
• 4-point scale: forced opinion
• Define each level
When a rating comes in, ask why…
• Rating was low: understand the negative responses.
• Rating was high: replicate the good.
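The mid-point-cluster caution can be checked directly. Here is a minimal sketch (Python, with made-up ratings) that reports what share of respondents chose the midpoint of a 5-point scale alongside the mean.

```python
# Hypothetical 1-5 Likert ratings for a single survey item.
ratings = [3, 3, 4, 3, 2, 3, 5, 3, 3, 4]

midpoint = 3  # midpoint of a 1-to-5 scale
midpoint_share = ratings.count(midpoint) / len(ratings)
mean_rating = sum(ratings) / len(ratings)

print(f"mean = {mean_rating:.2f}, midpoint share = {midpoint_share:.0%}")
# A large midpoint share suggests respondents are defaulting to
# "neutral" -- one reason to ask why, or to try a 4-point scale.
```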
Lessons Learned
Constructing questions:
• Short and clear. Avoid misinterpretations.
• Consider pairing statements to ensure
reliability and validity
• Avoid “and”: this limits each statement to a single issue (e.g., split “useful and enjoyable” into two statements)
• Pilot
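One way to make the paired-statements check concrete is to include a reverse-worded twin of a key statement, reverse-code it, and correlate the pair. The sketch below is a minimal Python illustration with hypothetical ratings (statistics.correlation requires Python 3.10+); the slide does not prescribe this particular method.

```python
import statistics

# Hypothetical 1-5 ratings from the same ten respondents for a
# statement and its reverse-worded twin.
statement = [5, 4, 4, 2, 5, 3, 4, 1, 5, 4]
twin = [1, 2, 3, 4, 1, 3, 2, 5, 2, 2]

# Reverse-code the twin so both items point the same way
# on a 1-5 scale (1 <-> 5, 2 <-> 4, 3 stays 3).
recoded = [6 - r for r in twin]

# A high positive correlation suggests the pair is being read
# consistently; a low or negative one flags a wording problem
# worth fixing during the pilot.
r = statistics.correlation(statement, recoded)
print(f"Pearson r between paired items: {r:.2f}")
```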
Will assessing make a difference?
Data has contributed to our need to know…
– We are…
– Our students are…
– We know…
– We made a difference in these ways…
– The goals we met are…
We have the evidence!
Informing Practice
Quality Assessment
July 31, 2006
Written and produced by
Halyna Kornuta