Crowd-Sourcing Innovative Practices:
Assessing Integrative Learning
at Large Research Institutions
Mo Noonan Bischof
Assistant Vice Provost
mabischof@wisc.edu
Amy Goodburn
Associate Vice Chancellor
agoodburn1@unl.edu
Nancy Mitchell
Director, Undergraduate Education
nmitchell1@unl.edu
LEAP Integrative Learning
Synthesis and advanced accomplishment across general and specialized studies, demonstrated through the application of knowledge, skills, and responsibilities to new settings and complex problems.
Challenge: Assessing Integrative Learning
• Can/should the same assessment tools be used for assessing within a course, a unit, and/or an institution?
• Does integrative learning apply to integrating knowledge and skills within a discipline, among disciplines, or both?
• How can we align quality improvement levels while respecting disciplinary purposes & values?
UW-Madison Learning Community
21,615 employees:
• 2,177 faculty
• 1,635 instructional academic staff
• 1,261 research academic staff
• 5,291 graduate assistants
42,820 students:
• 29,118 undergraduates
• 9,183 graduate students
• 2,774 professional students
• 1,745 non-degree students
Annually:
• 7,400 new undergraduates
• 29,500 enrolled undergraduates
• 6,500 Bachelor's degree graduates
13 academic schools/colleges with distributed responsibility and governance
~500 academic programs, all levels
134 Bachelor's-level degree programs
[Chart: degree programs grouped by annual degrees awarded (1-49, 50-99, 100-199, 200-299, more than 300); includes WIX and ELOs.]
[Diagram: institutional-level learning goals and assessments linked to multiple sets of program-level learning goals and assessments.]
Why pilot the AAC&U VALUE Rubrics?
• Identified gap: institutional-level assessment with a direct-measure approach
• Evaluates student learning across programs
• Aligns with AAC&U Essential Learning Outcomes
• Aligns with VSA/College Portrait demonstration project
• First pilot project summer 2012; second pilot 2013
• Main goal: bring faculty across disciplines together to evaluate student work
AAC&U VALUE Rubric Project
Rubrics
• AAC&U VALUE written communication rubric
Scorers
• Cohort of 25 faculty
• Cross-disciplinary representation
• Focus on faculty engagement
Artifacts
• “Value-added” approach to compare first-year students and students near graduation
Written Communication VALUE Rubric
Selected written communication for ease of identifying artifacts across disciplines/programs.
Dimensions:
• Context and Purpose for Writing
• Content Development
• Genre and Disciplinary Conventions
• Sources and Evidence
• Control of Syntax and Mechanics
Artifacts: “Value-added” Approach
• Goal was to collect 350 artifacts at each level, first-year (FYR) and nearly graduating (NGR)
• Identified 52 courses that enrolled high numbers of FYR and NGR students and seemed likely to have a suitable writing assignment
• 22 courses (41 instructors) had a suitable assignment and agreed to participate
• Invited 2,450 students to submit artifacts
• Collected 451 submissions
Scorers: Faculty Engagement
• 1.5-day workshop in June 2013
• Set ground rules
• 3 structured rounds intended to familiarize faculty with the rubric and to “test” scorer agreement
• Asked faculty to think beyond their field/discipline
• Each scorer rated about 40 artifacts
• Discussion revealed challenges with the 4-point scale and with what counts as “mastery”
Table 1. Overall Results for All Artifact Scores

Rubric Dimension | Student Group     | # of Artifacts | Mean | Std Dev | Zmw Score
Context          | Nearly Graduating | 213            | 2.95 | 0.95    | 3.05*
                 | First Year        | 237            | 2.77 |         |
Content          | Nearly Graduating | 213            | 2.79 | 0.96    | 4.68*
                 | First Year        | 237            | 2.48 |         |
Genre            | Nearly Graduating | 211            | 2.69 | 0.88    | 2.65*
                 | First Year        | 235            | 2.50 |         |
Sources          | Nearly Graduating | 190            | 2.61 | 0.99    | 1.54
                 | First Year        | 225            | 2.50 |         |
Syntax           | Nearly Graduating | 213            | 2.82 | 0.84    | 2.16*
                 | First Year        | 237            | 2.69 |         |

*Zmw score is from the Mann-Whitney U-test. Zmw scores > 1.96 indicate that the two groups are significantly different at p = 0.05.
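The Zmw comparison in Table 1 can be reproduced with standard tools. Below is a minimal Python sketch, not the project's actual code: the score arrays are hypothetical stand-ins for the rubric data, and the z score uses the usual normal approximation without a tie correction.

import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
# Hypothetical 4-point rubric scores for one dimension (sizes match Table 1).
nearly_grad = rng.integers(1, 5, size=213)  # NGR artifacts
first_year = rng.integers(1, 5, size=237)   # FYR artifacts

u_stat, p_value = mannwhitneyu(nearly_grad, first_year, alternative="two-sided")

# Normal approximation for the reported z (Zmw) score.
n1, n2 = len(nearly_grad), len(first_year)
mu_u = n1 * n2 / 2                               # mean of U under H0
sigma_u = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)  # SD of U under H0 (no tie correction)
z_mw = (u_stat - mu_u) / sigma_u

# |Zmw| > 1.96 corresponds to a significant difference at p = 0.05 (two-sided).
print(f"U = {u_stat:.1f}, Zmw = {z_mw:.2f}, p = {p_value:.4f}")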
Figure 1. Distribution of Combined Scores - Written Communication Rubric
[Bar chart: percent of scores at each rubric level (1-4) for First-Year Students vs. Nearly Graduating Students; labeled values are 2.6, 3.8, 17.5, 22.6, 27.3, 30.3, 44.4, and 51.0 percent, with most scores for both groups at levels 2 and 3.]
Summary Findings
• The percent of nearly graduating students judged proficient or better (a score of 3 or 4 on the 4-point scale) was fairly high on each dimension, ranging from 64% to 83%; across all dimensions: 74.7%
• Significant differences between first-year and nearly graduating students were weak
• Inter-scorer reliability was problematic (the “mastery” issue): overall, 67% of scorer pairs showed weak agreement or systematic disagreement
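To make the reliability and proficiency checks concrete, here is a minimal sketch with hypothetical data, not the project's actual analysis, of how percent proficient and pairwise scorer agreement might be computed:

import numpy as np

def pair_agreement(scores_a, scores_b):
    """Exact and within-one-point agreement for one pair of scorers, plus the
    mean difference (a consistent sign suggests systematic disagreement,
    i.e., one scorer rating consistently harsher than the other)."""
    a, b = np.asarray(scores_a), np.asarray(scores_b)
    return np.mean(a == b), np.mean(np.abs(a - b) <= 1), np.mean(a - b)

# Hypothetical 4-point rubric scores from two scorers on the same artifacts.
scorer_1 = np.array([3, 2, 4, 3, 2, 3, 1, 4, 3, 2])
scorer_2 = np.array([2, 2, 3, 2, 2, 2, 1, 3, 2, 2])

exact, adjacent, bias = pair_agreement(scorer_1, scorer_2)
print(f"exact: {exact:.0%}, within one point: {adjacent:.0%}, mean diff: {bias:+.2f}")

# Percent "proficient or better" (score of 3 or 4 on the 4-point scale).
print(f"proficient or better: {np.mean(scorer_1 >= 3):.0%}")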
What did we learn?
• Importance of assignment (artifact) development
• Adapt the rubric to the program mix and/or campus culture (language, LOs)
• Engagement of faculty = high-quality discussions (ground rules/calibration)
• Next steps: continue to engage faculty at the program and disciplinary levels
Contact Information
Mo Noonan Bischof, Assistant Vice Provost,
University of Wisconsin-Madison, mabischof@wisc.edu
More about our project: http://apir.wisc.edu/valuerubricproject.htm
University of Nebraska-Lincoln
Research One, Big Ten Conference, Land-Grant
24,000 students
8 independent colleges
Achievement-Centered Education (ACE)
• 10 Student Learning Outcomes (30 credits)
• 600 courses across 67 departments
• Transferable across 8 colleges
• Requires assessment of collected student work
UNL Assessment Context
• Review of each ACE course on a 5-year cycle
• Biennial review of all undergraduate degree programs
• 50 disciplinary program accreditations
• 10-year North Central/HLC accreditation
ACE 10
Generate a creative or scholarly product that requires broad knowledge, appropriate technical proficiency, information collection, synthesis, interpretation, presentation, and reflection.
HLC Quality Initiative: ACE 10 Project
25 faculty across colleges meet monthly to:
• Explore methods and tools for assessing work
• Develop a community to share ideas
• Connect ACE 10 & degree program assessment
• Develop a process for creating assessment reports
• Create a team of assessment “ambassadors”
Discussing Assessment Practices
A Common Rubric: disciplinary vs. institutional goals
Inquiry Project Results
• Abandoned idea to pilot a common rubric
• Revised syllabus to focus on processes, not tools
• Developed poster session for public sharing
• Streamlined ACE & program review processes
• Creating process for 5-year ACE program review
Group Discussion
• How do you address differences across disciplinary norms and cultures?
• What strategies can you use to develop shared goals and understanding?
• How can program/disciplinary assessments inform institutional assessment and vice versa?
• What are some effective practices for supporting and sustaining faculty and staff engagement?