WesternU Assessment Kick-off Meeting:
The why's, who's, what's, how's, and when's of assessment
2013-2014
Institutional Research & Effectiveness
Neil M. Patel, Ph.D.
Juan Ramirez, Ph.D.
Meeting Roadmap
• The goals are to understand
– Why assessment needs to take place
– Who should be involved in assessment
– What needs to be assessed
– How to assess the learning outcomes
– When assessment reports are due
Assessment Overview
• Why assess?
– Accountability
– To measure learning
– To identify challenges related to instruction, curriculum, or
assignments.
– To improve learning
• Methods must be in place to properly assess
• Information should be shared widely and used to inform decision-making
• Key Players
– Deans, Faculty, Curriculum committees, Assessment
committees, Assessment Specialists, Preceptors
What needs to be assessed?

INSTITUTIONAL LEARNING OUTCOMES
Phase 1 (2012-13): Evidence-based practice; Interpersonal communication skills
Phase 2 (2013-14): Critical thinking; Collaboration skills
Phase 3 (2014-15): Breadth and depth of knowledge in the discipline/Clinical competence; Ethical and moral decision making skills
Phase 4 (2015-16): Life-long learning; Humanistic practice
[Charts: WesternU Evidence-based Practice Assessment and WesternU Interpersonal Communication Skills Assessment. Each chart scores the report sections (Assessable Learning Outcome(s), Evidence, Assessment Participation, Assessment Goals, Methods for data collection, Results, Discussion, Implications, Overall) on a 0-3 scale, comparing the Pilot Phase with Year 1.]
What needs to be assessed? (cont.):
We cannot assess everything!
• Direct assessment of signature assignments
– Signature assignments have the potential to help us know whether student learning reflects "the ways of thinking and doing of disciplinary experts"
– Course-embedded assessment
– Aligned with LOs
– Authentic in terms of process/content, "real-world application"
• Indirect assessment, i.e., student perceptions
– First-year survey
– Graduating survey
– Alumni surveys
– Student evaluation of course
ILO ASSESSMENT TEMPLATE
Assessment Template
• Timeline
• Section I: Progress Report
• Section II: Learning Outcome Alignment
• Section III.1: Methodology
• Section IV.1: Results
• Section V.1: Discussion & Implications
• Section III.2: Methodology
• Section IV.2: Results
• Section V.2: Discussion & Implications
Section I: Progress Report
• Goal: To document what occurred as a result of the 2012-2013 assessment.
Section II: Learning Outcome Alignment
• Goal: To determine which PLOs align with the ILO and, over time, which PLOs are not being assessed.
Section III: Methodology
• It will be necessary to copy and paste Sections III-V if more than two assessments are completed.
• Every ILO report needs to include one direct and one indirect assessment. Multiple assessments may be necessary to cover ALL PLOs.
Section III: Methodology
• Note: The Participation section is for participation in the assessment process, not for participation in the development of the student work.
Section IV: Results
• Analytical approach (see the sketch after this list)
– Should align with the assessment goal
– To determine how many students are achieving at a specific level/score: frequency distribution
– To determine if differences in scores exist between two or more groups: chi-square, t-test, or ANOVA
– To determine if scores from one assignment predict scores on another assignment: regression
• Sample size: number of students assessed
• Statistical results: frequency table, p value, etc.
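Not part of the template, but as a rough sketch of how these three approaches translate into analysis code (Python with NumPy/SciPy assumed; all data below is made up for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
scores = rng.integers(0, 4, size=30)   # made-up rubric scores on a 0-3 scale
group = rng.integers(0, 2, size=30)    # made-up group labels (0 or 1)

# Goal: how many students achieve at a specific level/score?
# -> frequency distribution
levels, counts = np.unique(scores, return_counts=True)

# Goal: do scores differ between two groups? -> t-test (or chi-square/ANOVA)
t_stat, p_val = stats.ttest_ind(scores[group == 0], scores[group == 1])

# Goal: do scores on one assignment predict scores on another? -> regression
other = rng.uniform(0, 3, size=30)     # made-up second assignment scores
reg = stats.linregress(other, scores)

print(levels, counts, p_val, reg.slope)
```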
Section V: Discussion & Implications
EXAMPLE
Example
Scenario: Following a discussion among faculty, the Curriculum Committee, the Program Assessment Committee, and the Dean, it was decided that Critical Thinking will be assessed using 4th year preceptor evaluations.
Question: What do we need to do?
Example: 4th year preceptor evaluations to assess Critical Thinking
• Things to consider:
– Which PLO(s) are assessed?
– How is the assessment scored?
– Who has the data?
– What is/are the quantifiable assessment goals?
• Standards of success
– How do we analyze the data?
Example: 4th year preceptor evaluations to assess Critical Thinking
• Assessment: The preceptor evaluation of students occurs at various time points within the 4th year rotations. For the purpose of assessment, the program has decided to use the students' entire set of 4th year preceptor evaluations (eight evaluations in total). The preceptors are asked to indicate, using a Yes/No format, whether a student has been observed demonstrating certain skills or displaying certain knowledge elements; there are 20 total items in the evaluation form. These elements are commonly displayed within the profession. The data is sent directly to the 4th year Director. To assess Critical Thinking, a single item within the checklist is used: "The student utilizes and displays critical thinking."
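As an illustration of how the averaged per-student scores used in this example are produced, here is a minimal sketch in Python. The rotation-level Yes/No values are hypothetical, since the deck reports only the averages (e.g., .375 = 3 "Yes" out of 8).

```python
# Each student has eight rotation evaluations on the CT item; Yes = 1, No = 0.
# These rotation-level values are hypothetical, chosen to be consistent with
# the averaged scores shown in the data table (.375 = 3/8, 1.0 = 8/8).
rotations_student_16 = [1, 0, 0, 1, 0, 0, 1, 0]   # 3 "Yes" out of 8
rotations_student_1 = [1, 1, 1, 1, 1, 1, 1, 1]    # "Yes" in every rotation

def avg_ct_score(rotations):
    """Average a student's eight Yes/No evaluations into one CT score."""
    return sum(rotations) / len(rotations)

print(avg_ct_score(rotations_student_16))  # 0.375
print(avg_ct_score(rotations_student_1))   # 1.0
```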
Example: 4th year preceptor evaluations to assess Critical Thinking
• Assessment Goal: 90% of students will demonstrate critical thinking skills.
• Why did we come up with 90%?
– A peer or aspirational college has a similar standard
– The professional community suggests such a standard
– Our own data has set the standard
• The assessment goal is different from grading
– For grading, passing = 70%, i.e., 14/20 items, with "Yes" = 1 point
– It is possible for all students to score 0 on the Critical Thinking item and still pass.
Averaged data of 4th year preceptor evaluations assessing Critical Thinking per student

Student   CT Score      Student   CT Score
1         1             11        1
2         1             12        .5
3         1             13        1
4         .5            14        1
5         .75           15        1
6         1             16        .375
7         1             17        .5
8         1             18        .375
9         1             19        1
10        .5            20        1

CT Score: 0 = no, 1 = yes
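A minimal sketch (Python assumed) of the frequency distribution that the Section IV.1 results report, computed directly from the scores in the table above:

```python
from collections import Counter

# Averaged CT scores from the table above.
ct_scores = [1, 1, 1, .5, .75, 1, 1, 1, 1, .5,
             1, .5, 1, 1, 1, .375, .5, .375, 1, 1]

# "Yes" means the student demonstrated critical thinking in all eight
# rotations (average score = 1); anything less counts as "No".
counts = Counter("Yes" if s == 1 else "No" for s in ct_scores)
for label in ("No", "Yes"):
    pct = counts[label] / len(ct_scores)
    print(f"{label}: {counts[label]} ({pct:.1%})")
# No: 7 (35.0%), Yes: 13 (65.0%)  ->  below the 90% goal
```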
Example: Section III.1 Methodology

Name of assessment: 4th year preceptor evaluation

Evidence: Indicate if this is a direct or indirect assessment.
Direct

Evidence: PLO(s) assessed (Section II). List the PLOs that will be assessed by this particular assessment.
PLO 2: Upon graduation, students should be able to think critically when in the clinic.

Evidence: Description. Please write a narrative that explains the student work completely, so that someone who knows nothing about the program will understand what it consists of, and include how the assessment addresses the PLO(s).
Preceptors indicate, using a Yes/No format, whether students are observed demonstrating certain skills or displaying certain knowledge elements; there are 20 total items in the evaluation form. Eight rotations during the 4th year were used, and scores were averaged for each student.

Data Collection Method: How is the assessment scored? State the type of scoring mechanism used.
Yes/No scoring guide.

Data Collection Method: Does the data isolate the PLO? (Yes or No)
Yes
Example: Section III.1 Methodology

Data Collection Method: Provide the scoring mechanism as an attachment, as well as any other important documents for this assessment. State the title of the attachment(s) and what each one includes. If applicable, please highlight what specifically is being utilized for assessment within the attachment.
Single item: The student utilizes and displays critical thinking.

Please state the quantifiable assessment goal: Assessment is completed to determine how well students are achieving the PLOs. For example, a goal may be written to determine how many students are achieving at a specific level/score. There can be more than one goal for each assessment, for example, whether students are reaching a particular score and whether current students are performing differently from previous students.
90% of students will demonstrate critical thinking skills in all eight rotations (avg score = 1).

Participation: Describe the assessment process and who participated. Please list the roles each person played. This section is meant to keep track of program participation from faculty, committees, deans, Institutional Research, etc.
Faculty, the Curriculum Committee, the Assessment Committee, and the Dean selected the assignment; 4th year preceptors evaluated students; the 4th year program director collected the data; the Assessment Committee analyzed the data.
Example: Section IV.1 Results

Assessment 1 Name: Please state the name of the chosen assignment, survey, exam, etc.
4th year preceptor evaluation

Assessment 1 Goal (Section III.1):
90% of students will demonstrate critical thinking skills in all eight rotations (avg score = 1)

Analytical Approach:
Frequency distribution

Sample Size:
N = 20

Statistical Results: Present the statistical results in a figure or table that aligns with the goal.

         Frequency   Percent
No       7           35.0%
Yes      13          65.0%
Total    20          100.0%
Example: Section V.1 Discussion & Implications

Assessment 1 Name: Please state the name of the chosen assignment, survey, exam, etc.
4th year preceptor evaluation

Assessment 1 Goal (Section III.1):
90% of students will demonstrate critical thinking skills

Discussion: Was the goal reached? (Yes or no; if no, why):
No; only 65% of students demonstrated critical thinking skills.

Discussion: How do the results relate back to the PLO? How are students performing (refer to results) in relation to the PLO? What do the results mean? What were the limitations?
65% of the students were able to demonstrate critical thinking skills in the clinic. Since this data is collected during their 4th year, it seems clear the program is not reaching the PLO. Although the results show the program is not meeting the goal, the program is limited by its data: at the moment there is no way to identify which students are falling short.

Implications: How are the results being used? Please describe what changes are being made, or whether things will remain the same, in regards to the PLO being assessed. Who were the results discussed with, or have they been circulated? Is there an action plan for closing the loop? Please describe.
The program is determining:
1. If preceptors know what to look for when evaluating students
2. If there are predictors of student success on this assignment
3. If previous 4th year evaluations lead to a different conclusion
4. If the assessment is rigorous
"You can see a lot by just looking."
– Yogi Berra
Student   CT Score   Gender      Student   CT Score   Gender
1         1          2           11        1          1
2         1          2           12        .5         1
3         1          1           13        1          2
4         .5         1           14        1          2
5         .75        1           15        1          2
6         1          1           16        .375       1
7         1          1           17        .5         1
8         1          2           18        .375       1
9         1          1           19        1          2
10        .5         1           20        1          2

CT Score: 0 = no, 1 = yes
Gender: 1 = male, 2 = female
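Disaggregating the same scores by gender shows why "just looking" helps: every student who missed the goal is male. A minimal sketch (Python with SciPy assumed; the 2x2 counts are tallied from the table above) of the chi-square group comparison mentioned in Section IV:

```python
from scipy import stats

# From the table above: 5 of 12 males and 8 of 8 females have an average
# CT score of 1 ("Yes"); the rest are "No".
#               Yes  No
contingency = [[5,   7],   # male
               [8,   0]]   # female

chi2, p, dof, expected = stats.chi2_contingency(contingency)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```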
GROUP ACTIVITY
Timeline
TIMELINE FOR PROGRAMS
Section I: Progress Report (draft), Section II: Institutional Learning Outcome & Program Learning Outcome Alignment (draft), and Section III: Methodology, Assessment Goals, & Participation (draft): May 9, 2014
Section IV: Results (draft): June 6, 2014
FINAL Assessment Report Due: July 31, 2014
TIMELINE FOR REVIEW
Assessment Committee Review of Reports: Aug 2014
Distribution of Feedback: Oct 2014
Meetings of Understanding: Dec 2014 - Jan 2015
Report to Provost: Feb 2015
Deans' Council Presentation: March 2015
CAPE Workshops
Spring 2014
• Measurable Student Learning Outcomes
– Tuesday, January 14 at 12pm
• Curricular Mapping
– Tuesday, February 11 at 12pm
• Operationalizing and assessing WesternU ILOs
– Tuesday, March 4 at 12pm
• Developing Valid and Reliable Rubrics
– Tuesday, April 8 at 12pm
• Basic Techniques in Presenting Data
– Tuesday, May 6 at 12pm
• Closing the Loop
– Tuesday, June 10 at 12pm
Questions?
Concerns?
Institutional Learning Outcomes assessment information can be found on the IRE website: http://www.westernu.edu/ireassessment-home