ASSESSMENT TOOLS & FRAMEWORKS FOR COLLECTING FORMATIVE FEEDBACK ON COURSE DESIGN

Jim Julius
SDSU Course Design Institute
May 27, 2009
Guiding Questions
 Why collect formative feedback on course design?
 How should one decide what kind of feedback to seek?
 What tools are available to collect feedback?
 What do I do with the data?
What (and why) are you measuring?
 Summative Assessment of Student Learning
 Formative Evaluation of Course Design
 Formative Assessment of (for) Student Learning
What (and why) are you measuring?
 Outcomes: tell you what you got, not how or why
 Inputs
 Processes
 Seeking continuous improvement
 Approaching course design from an inquiry mindset
Outcomes
 Satisfaction
 Retention
 Success
 Achievement
 External proficiencies
 Real-world performance
Inputs
 Learner characteristics
 Context
 Design
 Learning resources
 Faculty development
Processes
 Pedagogies
 Presentation media
 Assignments/assessments
 Student use of technologies
 Community of Inquiry model (social, cognitive, teaching presence)
 Interactions (content, peers, instructor, technology itself)
Community of Inquiry Model
[diagram: overlapping social, cognitive, and teaching presence]

CoI - Interactions
[diagram: learner interactions with content, peers, instructor, and the technology itself]
Narrowing Your Inquiry
 Do you want to evaluate your course according to “best practices”, i.e. standard course design quality criteria?
 Do you want to know more about your learners in general: needs, preferences, motivation, satisfaction?
 Do you want to focus on student achievement?
 Do you want feedback on your facilitation of learning?
 Do you want feedback on specific course elements and/or technologies?
Course Design Quality Criteria
 Chico rubric
 Quality Matters
 Related to Chickering and Gamson’s “7 Principles for Good Practice in Undergraduate Education”:
   From Indiana University, 2001
   From VCU, 2009
 Paid tool: Flashlight
Learning about Learners

Direct
 Learning styles surveys
 National and institutional surveys
   ELI – student and faculty
   SDSU’s LRS faculty and student surveys, adapted from LITRE (NC State)
   Distance Education Learning Environment faculty and student surveys

Indirect
 Parallel faculty-student data (aggregate)
 Institutional data (for your learners)
 LMS data (summarized in the sketch below)
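
LMS usage logs are usually analyzed from an export. Below is a minimal Python sketch, assuming a hypothetical CSV export with student_id and timestamp columns; the file name and layout are invented, not a real Blackboard format:

```python
# Minimal sketch: count activity events per student from a hypothetical
# LMS activity export. File name and column names are assumptions,
# not a real Blackboard export format.
import csv
from collections import Counter

events = Counter()
with open("lms_activity_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        events[row["student_id"]] += 1  # one row per logged event

# Show the five most active students
for student, count in events.most_common(5):
    print(student, count)
```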
Student Achievement

Direct
 Low-stakes: muddiest point, minute papers, clickers, discussion boards
 Pre- and post-tests
 Outcome comparisons: different technology/pedagogy and same outcome, or same technology/pedagogy and different outcomes (see the sketch below)

Indirect
 Grade data
 Attendance/participation
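
To make the outcome-comparison idea concrete, here is a minimal Python sketch (not from the slides) that asks whether grade distributions differ between two sections taught with different pedagogy; all counts are invented:

```python
# Sketch: chi-square test of independence on grade counts for two
# sections (different technology/pedagogy, same outcome measure).
# All counts below are invented for illustration.
from scipy.stats import chi2_contingency

#              A    B   C   D   F
blended     = [120, 180, 90, 40, 20]  # hypothetical section counts
traditional = [110, 175, 95, 45, 25]  # hypothetical section counts

chi2, p, dof, expected = chi2_contingency([blended, traditional])
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
# A large p-value gives no evidence that the grade distributions differ.
```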
Teacher Behaviors/Overall

Direct
 Community of Inquiry Survey
 Small Group Analysis
 Mid-semester surveys
 End of course evaluations
 Paid: IDEA survey of student ratings of instruction

Indirect
 Observation Protocols
 Assessing online facilitation
Course Elements

Direct
 Student Assessment of Learning Gains (SALG)
 Clicker opinions survey

Indirect
 Examine usage data from Blackboard
Data from M. Laumakis
 pICT fellow in 2005
 Began teaching parallel 500-student sections of PSYCH 101 in 2006, one traditional and one hybrid
 First fully online PSYCH 101, Summer 2008
Evaluating the Face-to-Face Class
 Evaluated Fall 2005 innovations via the Student Assessment of Learning Gains (SALG)
 How much did the following aspects of the class help your learning?
 Rated from 1 (no help) to 5 (great help)
Evaluating the Face-to-Face Class
 What did the data show?

Question                 MWF Section   TTH Section
ConceptCheck Questions   4.1           4.1
Discussion Boards        2.9           3.1
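
The section means above are simple averages of each question's 1–5 ratings within a section. A minimal Python sketch with invented responses:

```python
# Sketch: per-section SALG item means from raw 1-5 ratings.
# The responses below are invented for illustration.
from statistics import mean

responses = {
    ("ConceptCheck Questions", "MWF"): [4, 5, 4, 3, 5],
    ("ConceptCheck Questions", "TTH"): [4, 4, 5, 4, 3],
    ("Discussion Boards", "MWF"):      [3, 2, 3, 4, 2],
    ("Discussion Boards", "TTH"):      [3, 3, 4, 2, 4],
}

for (question, section), ratings in responses.items():
    print(f"{question} ({section}): {mean(ratings):.1f}")
```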
Evaluation Findings: IDEA Diagnostic Survey

                         Fall 2006   Fall 2006     Spring 2007   Spring 2007
                         Blended     Traditional   Blended       Traditional
Progress on objectives   70          73            77            77
Excellent teacher        65          68            69            68
Excellent course         62          72            73            71

Note: Top 10% = 63 or more
Evaluation Findings: Departmental Course Evaluations
[chart not reproduced]
Evaluation Findings: Course Grades
[bar chart: Fall 2007 course grades; % of students in each grade category (A–F) for the blended vs. traditional sections]
Clicker Data: Spring 2007

Question                                                          % Agree or Strongly Agree
Class clicker usage makes me more likely to attend class.         93%
Class clicker usage helps me to feel more involved in class.      84%
Class clicker usage makes it more likely for me to respond
to a question from the professor.                                 91%
I understand why my professor is using clickers in this course.   90%
My professor asks clicker questions which are important to
my learning.                                                      90%
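
The right-hand column collapses a 5-point Likert item to the share of responses at 4 or 5. A minimal Python sketch with invented ratings:

```python
# Sketch: percent of Likert responses (1 = strongly disagree ...
# 5 = strongly agree) that are 4 (agree) or 5 (strongly agree).
# Ratings below are invented for illustration.
ratings = [5, 4, 4, 5, 3, 5, 4, 2, 5, 4]

pct = 100 * sum(r >= 4 for r in ratings) / len(ratings)
print(f"{pct:.0f}% agree or strongly agree")  # 80% for this sample
```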
Summer 2008 Fully Online: SALG Data
 How much did the following aspects of the class help your learning?
 Rated from 1 (no help) to 5 (great help)
Summer 2008 Fully Online: SALG Data

Question                                                      Summer 2008 Online
Taking the test online                                        4.27
Discussion Forums                                             3.00
Introduction e-mail that explained the basics of the course   4.50
SALG Data over time

Question                       Fall 2007   Fall 2007   Spring 2008   Spring 2008   Summer 2008
                               Blended     F2F         Blended       F2F           Online
Questions, answers, and
discussions in class           3.96        4.04        4.10          4.01          4.36
Live online class sessions     3.39        n/a         4.20          n/a           4.15
Archives of live online
class sessions                 4.15        n/a         4.50          n/a           4.44
Quality of contact with
the teacher                    3.41        3.48        3.94          3.90          4.26
Working with peers outside
of class/online                3.12        3.22        3.31          3.39          3.82
Summer 2008: Community of Inquiry Survey
 Statements rated from 1 (strongly disagree) to 5 (strongly agree)
 Based on the Community of Inquiry framework’s three elements:
  1. Social Presence
  2. Cognitive Presence
  3. Teaching Presence
Summer 2008: Community of Inquiry Survey

CoI Dimension               Student Ratings
Social Presence             3.94
  Affective Expression      3.56
  Open Communication        4.29
  Group Cohesion            3.97
Cognitive Presence          3.96
  Triggering Event          3.91
  Exploration               3.73
  Integration               4.09
  Resolution                4.10
Teaching Presence           4.38
  Design and Organization   4.50
  Facilitation              4.38
  Direct Instruction        4.23
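
Each subcategory score above is the mean of its items' 1–5 ratings, and each presence score averages all items in that presence (e.g., 3.56, 4.29, and 3.97 average to the 3.94 Social Presence score). A minimal Python sketch with invented item ratings:

```python
# Sketch: CoI subscale scoring. Subcategory score = mean of its items;
# presence score = mean over all items in the presence.
# Item ratings below are invented for illustration.
from statistics import mean

social_presence_items = {
    "Affective Expression": [4, 3, 4],
    "Open Communication":   [5, 4, 4],
    "Group Cohesion":       [4, 4, 3],
}

for subcat, ratings in social_presence_items.items():
    print(f"{subcat}: {mean(ratings):.2f}")

all_items = [r for ratings in social_presence_items.values() for r in ratings]
print(f"Social Presence (overall): {mean(all_items):.2f}")
```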
So, what would you like to further explore?