Faculty Senate Assessment Committee
Facilitator: Katherine Cermak
Associate Dean for Planning & Assessment
April 2016
Participants will be able to:
• Use basic techniques to summarize assessment data (quantitative and qualitative).
• Compare assessment data to expectations of student knowledge or ability.
• Present assessment data as information to support decision-making.
1) Select learning outcome(s) to be assessed.
2) Locate demonstrations of outcome(s) and collect student work products.
3) Analyze student work and determine to what extent students are meeting expectations.
4) Share and discuss results internally.
5) Determine (if appropriate) actions for program improvement (and also for the assessment activities).
Effectively summarize and present results
Outcome: Students will develop, organize, and communicate information. (n = 50)

Criterion | Poor work (not acceptable) | Developing (approaching expectations) | Proficient (meeting expectations) | Exceeding expectations
Appropriate use of sources | 2 | 3 | 25 | 20
Integrated review of the literature | 2 | 8 | 28 | 12
Well-reasoned choice of methodologies | 2 | 8 | 33 | 7
Appropriate analysis | 5 | 10 | 25 | 10
Correct interpretation of results | 10 | 10 | 25 | 5
Outcome: develop, organize, and communicate information within the discipline (n = 50)

Criterion | Poor work (not acceptable) | Developing (approaching expectations) | Proficient (meeting expectations) | Exceeding expectations | Meeting or exceeding expectations
Appropriate use of sources | 2 | 3 | 25 | 20 | 45/50
Integrated review of the literature | 2 | 8 | 28 | 12 | 40/50
Well-reasoned choice of methodologies | 2 | 8 | 33 | 7 | 40/50
Appropriate analysis | 5 | 10 | 25 | 10 | 35/50
Correct interpretation of results | 10 | 10 | 25 | 5 | 30/50
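The "Meeting or exceeding" column is simply the Proficient and Exceeding counts combined. A minimal Python sketch of that step (the dictionary below restates the counts from the table above):

    # Combine Proficient + Exceeding counts into a "meeting or exceeding" figure.
    counts = {  # (poor, developing, proficient, exceeding), n = 50 per criterion
        "Appropriate use of sources":            (2, 3, 25, 20),
        "Integrated review of the literature":   (2, 8, 28, 12),
        "Well-reasoned choice of methodologies": (2, 8, 33, 7),
        "Appropriate analysis":                  (5, 10, 25, 10),
        "Correct interpretation of results":     (10, 10, 25, 5),
    }
    for criterion, (poor, dev, prof, exc) in counts.items():
        n = poor + dev + prof + exc
        print(f"{criterion}: {prof + exc}/{n} meeting or exceeding")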
Outcome: develop, organize, and communicate information within the discipline
(Scale: Poor = 1, Developing = 2, Proficient = 3, Exceeding = 4)

Criterion | Average | Standard deviation
Appropriate use of sources | 3.3 | 0.75
Integrated review of the literature | 3.0 | 0.76
Well-reasoned choice of methodologies | 2.9 | 0.68
Appropriate analysis | 2.8 | 0.88
Correct interpretation of results | 2.5 | 0.93
Overall outcome | 2.8 | 0.99
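Assuming the averages were computed by scoring each rating on the 1-4 scale above, a short sketch like this reproduces them; the sample standard deviation matches the values shown. The counts used here are the "Correct interpretation of results" row from the earlier table:

    # Weighted mean and sample standard deviation from rubric counts (scale 1-4).
    import statistics

    scale = {"poor": 1, "developing": 2, "proficient": 3, "exceeding": 4}
    counts = {"poor": 10, "developing": 10, "proficient": 25, "exceeding": 5}
    # Expand the counts into one rating per student, then summarize.
    ratings = [scale[level] for level, k in counts.items() for _ in range(k)]
    print(round(statistics.mean(ratings), 1))   # 2.5
    print(round(statistics.stdev(ratings), 2))  # 0.93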
Outcome: Communicate information (n = 50)

Criterion | Poor work (not acceptable) | Developing (approaching expectations) | Proficient (meeting expectations) | Exceeding expectations | Meeting or exceeding expectations
Appropriate use of sources | 4% | 6% | 50% | 40% | 90%
Integrated review of the literature | 4% | 16% | 56% | 24% | 80%
Well-reasoned choice of methodologies | 4% | 16% | 66% | 14% | 80%
Appropriate analysis | 10% | 20% | 50% | 20% | 70%
Correct interpretation of results | 20% | 20% | 50% | 10% | 60%
The expectation was that at least 80% of students would meet or exceed expectations and fewer than 5% of students would be in the poor category. . . .
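Checking each criterion against that two-part expectation is mechanical; a minimal sketch using the percentages from the table above:

    # Compare each criterion to the expectation:
    # at least 80% meeting/exceeding AND under 5% in the poor category.
    pct = {  # (poor %, meeting-or-exceeding %)
        "Appropriate use of sources":            (4, 90),
        "Integrated review of the literature":   (4, 80),
        "Well-reasoned choice of methodologies": (4, 80),
        "Appropriate analysis":                  (10, 70),
        "Correct interpretation of results":     (20, 60),
    }
    for criterion, (poor, meet_exceed) in pct.items():
        met = meet_exceed >= 80 and poor < 5
        print(f"{criterion}: {'expectation met' if met else 'expectation NOT met'}")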
Outcome: Students can solve problems using scientific processes.
Counts, Learning Outcome 5 (shown on the slide as a bar chart)

Answer | Question 16 (Correct: C) | Question 19 (Correct: B) | Question 25 (Correct: A)
A | 243 | 241 | 668
B | 118 | 568 | 18
C | 548 | 45 | 265
D | 65 | 120 | 43
Outcome: Students can solve problems using scientific reasoning.
Percentages, Learning Outcome 5 (N = 974)
Expectation: 65% would select the correct answer.

Answer | Question 16 (Correct: C) | Question 19 (Correct: B) | Question 25 (Correct: A)
A | 25% | 25% | 67%
B | 12% | 58% | 2%
C | 56% | 5% | 27%
D | 7% | 12% | 4%
Overall, 60% of students chose the correct answer; per item, the rate ranged from 56% to 67%.
Performance expectations were not met.
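A sketch of the per-item arithmetic, using the answer counts from the counts slide; it assumes each item's percentage is computed against that item's own total responses, which matches the figures shown:

    # Percent of students choosing the correct answer, per item.
    items = {  # question: (correct answer, {answer: count})
        "Question 16": ("C", {"A": 243, "B": 118, "C": 548, "D": 65}),
        "Question 19": ("B", {"A": 241, "B": 568, "C": 45, "D": 120}),
        "Question 25": ("A", {"A": 668, "B": 18, "C": 265, "D": 43}),
    }
    for question, (answer, counts) in items.items():
        n = sum(counts.values())
        print(f"{question}: {100 * counts[answer] / n:.0f}% correct")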
Question 16
To choose C students must . . . .
A was the most common distractor because. . . .
B was most likely chosen because . . . .
Reporting Quick/Exploratory Analysis
Describe your process
# of documents/comments/participants
Analysis Method
Findings
Useful Phrases:
The main issues discussed/mentioned were . . .,
The prevailing factor/theme was . . .
XYZ was a common theme raised . . .
Less useful phrases (but sometimes necessary):
A small number . . .
One respondent . . .
Coding
Identify your expectations/biases
Read through all documents
Read through again
Identify themes
Code by discrete inputs (each comment/each line).
Examine your “other” or “misc” category for additional themes.
Examine text that has no code for missing themes
Tabulate (a tallying sketch follows this list)
Prevalence (does not equal importance)
Interpret
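A minimal sketch of the tabulation step, applied to comments like the ones listed next. The comment-to-code assignments below are hypothetical stand-ins for illustration, not the committee's actual coding:

    # Tally how often each theme (code) was applied across coded comments.
    from collections import Counter

    coded_comments = {  # comment id -> codes assigned (hypothetical assignments)
        1: ["consensus building", "setting expectations"],
        2: ["efficient grading", "setting expectations"],
        9: ["efficient grading", "consensus building"],
        10: ["maintaining rigor", "other"],
        11: ["efficient grading"],
        12: ["divisive"],
        13: ["consensus building"],
    }
    tally = Counter(code for codes in coded_comments.values() for code in codes)
    for code, n in tally.most_common():
        print(f"{code}: {n}")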
1) Creating the rubric and norming with samples of student work was fascinating—we are now all on the same page.
2) Overall, grading is faster and students have indicated that they appreciate seeing what I look for when I grade their work.
9) We assess/grade as a group and it's much faster now that we're using the rubric compared to before.
10) Providing the grading rubric ahead of time does not make the assignment less rigorous. It just makes clear how hard the assignment actually is, and cuts down on time spent justifying grades to students.
11) Grading goes much faster, now that I’ve targeted what I’m looking for.
12) I disagree with the other faculty on what is important—process vs product—and this has created difficulties when we group grade that the rubric creation/norming process just exacerbates.
13) I thought the norming exercise was fun: being able to see where we all agree, discuss our differences, and figure out how to revise the rubric so that we could all use it similarly.
Theme tallies from coding the comments (a comment may receive more than one code):

Setting Expectations: 5
Efficient Grading: 4
Consensus Building: 3
Self Assessment: 3
Maintaining Rigor: 2
Divisive: 2
Other: "justifies grades" (consensus building? setting expectations?)
Tally | Code | Interpretation | Quotes
5 | Rubrics—positive (setting expectations) | The sharing of the rubric clarified expectations, especially to students but also among faculty members. | "…"
4 | Rubrics—positive (grading) | Faster, more efficient grading. | "…"
3 | Rubrics—positive (consensus building) | Creating and norming together created more consensus around expectations/grading. | "…"
3 | Rubrics—positive (planning/self-assessment) | Some students used the rubric to improve their performance. | "…"
2 | Rubrics—positive (rigor maintained) | Sharing the rubrics didn't result in the assignments becoming less rigorous. | "…"
2 | Rubrics—negative (lack of consensus) | Lack of consensus on the criteria themselves and their relative importance; divisive. | "…"
Qualitative Data Analysis Software Descriptions/Cost:
http://www.eval.org/p/cm/ld/fid=81

Google Forms/Spreadsheets (Teaching Technology) assistance:
Dr. Julie Zhu, Deputy Director for Instructional Design and Technology Integration

Suskie, Linda. (2009). Assessing Student Learning: A Common Sense Guide. San Francisco, CA: Jossey-Bass.

Using Assessment Results:
https://manoa.hawaii.edu/assessment/workshops/index.htm
Faculty Senate Assessment Committee Members
Engineering & Weapons:
Dr. Steve Graham and Dr. Deborah Mechtel
Humanities & Social Sciences:
Dr. Michelle Allen-Emerson and Dr. Silvia Peart
Math & Science:
Dr. Nick Frigo and Dr. Shirley Lin
Professional Development:
LT C. Hirsch, LT C. Roncketti (incoming)
Leadership Education & Development:
CDR Joe McInerney, CDR Lon Olson (incoming)
Office of the Academic Dean & Provost
Dr. Katherine Cermak
Website: www.usna.edu/Academics/Academic-Dean/Assessment/
Assessment Resources
One-on-One consultations with departments, faculty, and staff
Yard-wide assessment events