PSYC 201 - General Psychology Interim Report

Interim Report: COPPIN STATE UNIVERSITY (CSU)
SCHOOL OF PROFESSIONAL STUDIES
DEPT. OF APPLIED PSYCHOLOGY AND REHABILITATION COUNSELING (DAPRC)
COURSE REDESIGN: PSYC 201 – GENERAL PSYCHOLOGY
I. Course Description
General Psychology (PSYC 201) is an introductory course that serves both as the gateway course
for the applied psychology major and as a general education requirement (GER) in the Social
Sciences.
Typically, the department offers 8-10 sections per semester and enrolls 300-350 students in the
traditional lecture format. Including summer and winter offerings, annual enrollment is ~700-800
students, approximately 24% of the University's annual enrollment (based on the Fall 2010
undergraduate enrollment of 3,298).
II. Issues that needed to be resolved (e.g., course drift, too many adjuncts, student
dissatisfaction, etc.)
Course Drift. Course content varies by section in terms of chapters covered, delivery of material,
learning objectives, and student learning outcomes:
o Variations in the degree to which technology is incorporated into the traditional
classroom (e.g., Tegrity lecture capture system, BlackBoard course management, Clicker
student response systems).
o Not all faculty integrate the online ancillary materials that accompany the textbook,
My Psychology Lab (MPL). Even among faculty who use MPL, its use varies greatly: as
part of lecture (e.g., video clips), as optional material for students, or as required, graded
assessments and activities.
Too many sections taught by adjunct faculty members. Only 2 of the 8 face-to-face sections
(25%) are currently taught by full-time faculty members. The 2 online sections are currently
taught by a full-time faculty member.
Poor Student Outcomes (DFWs)
o Data from Spring 2011 served as a baseline: ABC grades were earned by 67% of students, and
13% earned a D. Thirty-three percent (33%) of students earned grades of D, F, or W. This leads
to course repetition for D grades (majors) or F grades (non-majors).
III. Choice of redesign model or models
The Replacement Model was selected for this course redesign.
o Retain the basic structure of the traditional course, but replace some lectures with required,
hands-on 'Live Psychology Lab' laboratory experiences, as well as self-paced, asynchronous
computer-based learning activities and assessments (MPL).
o In the traditional lecture format, contact time is 160 minutes per week. In the
replacement model, this is reduced to 60 minutes of lecture per week, plus 60 minutes of
hands-on experiential learning laboratories (Live Psych Labs) and ~40 minutes of self-paced
computer-based learning activities (MPL).
o Live Psychology experiential learning labs will be used to meet institutional student learning
outcome goals. Specific labs will be designed to improve students' written and oral
communication skills, digital information literacy, and critical thinking/analytical reasoning
skills (e.g., Digital Literacy & Electronic Library research skills).
IV. Description of pilot
a. There were 7 traditional face-to-face sections offered; 3 were dedicated to freshman students
(FYE 1, 2, and 3, to promote community); 2 online sections; and 2 face-to-face sections that were
reconfigured into one redesigned section (n=60 students), taught this past Fall 2012 by Dr. Cameron.
This pilot followed the proposed format, with the integration of team meetings and ongoing
assessment of challenges.
b. Assessment plan
(1) Contrast DFW rates between the redesign and traditional course sections.
(2) Contrast student mastery of example assessments between the traditional and redesign courses for the
same format (face-to-face) and same Professor (Dr. Cameron) across semesters.
(3) Contrast student comments on faculty course evaluations across pre- and post-redesign sections for
each faculty member.
(4) Examine whether there is a reduction in the number of “repeaters”: i.e., students who have to re-take
General Psychology due to either F grades (General Requirements) or D grades (Psychology majors).
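For assessment item (1), the DFW-rate contrast reduces to a simple proportion: the share of enrolled students who earn a D, F, or W. A minimal sketch of that calculation is below; the grade lists are made-up illustrations, not actual course data.

```python
# Hypothetical sketch of assessment contrast (1): computing the DFW rate
# (percentage of D, F, and W grades) for a course section.
# The grade lists below are illustrative, not actual PSYC 201 data.

def dfw_rate(grades):
    """Return the percentage of grades that are D, F, or W, rounded to 0.1."""
    dfw = sum(1 for g in grades if g in {"D", "F", "W"})
    return round(100 * dfw / len(grades), 1)

traditional = ["A", "B", "B", "C", "D", "F", "W", "C", "B", "A"]
redesign = ["A", "B", "C", "C", "B", "D", "W", "A", "B", "B"]

print(dfw_rate(traditional))  # 30.0
print(dfw_rate(redesign))     # 20.0
```

The same function applied section by section yields the DFW columns contrasted in the tables that follow.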
V. Results: anecdotal, qualitative and quantitative
(1) Contrast of DFW rates between the redesign and traditional course sections
DFW Rates: Pilot Course Compared to Fall 2011 & Spring 2012 x Faculty
(all values except n are percentages)

Section                                       n    Fresh/Soph  ABCs  DFWs  Ds  Fs  Ws
Historical (all sections, SP2011) - Baseline  252  na          67    33    13  10  10
Traditional Fall 2011 (Cameron)               65   92          60    40    12  8   20
Pilot Redesign Fall 2012 (Cameron)            60   95          40    60    27  15  18
Traditional Sp 2012                           28   86          79    21    3   0   18
DFW Rates: Pilot Course Compared to Fall 2012 Traditional x Faculty
(all values except n are percentages; blank cells were not reported)

Section           n   Fresh/Soph  ABCs  DFWs  Ds    Fs    Ws
Section 002       35              62.9  37.1  17.1  14.3  5.7
Section 003**     30  95          36.7  63.3  30.0  13.3  20.0
Section 004**     30  95          43.3  56.7  23.3  16.7  16.7
Section 005       32              78.1  21.9  18.8  0     3.1
Section 101       33              69.7  30.3  12.1  3.0   15.2
FYE1              34  100         50.0  50.0  35.3  14.7
FYE2              39  100         76.9  23.1  20.5  2.6
FYE 3             35  100         65.8  34.2  17.1  17.1
Online Section A  25              60.0  40.0  8.0   20.0  12.0
Online Section B  12              50.0  50.0  41.7  8.3

**Pilot Redesign Fall 2012 (Cameron)
DFW rates varied by course format and faculty, and represented increases over baseline for 5 of the 8
face-to-face course sections. Within-faculty DFW rates increased by 20 percentage points from Fall 2011
and by 27 percentage points from the Spring 2011 baseline. Withdrawal rates varied considerably and were
highest in the Redesign sections. This may indicate decreased college preparedness in our student sample
(consistent with faculty impressions), as well as the presence of nesting effects and other confounding
variables such as GPA, major, technology preparedness, and faculty grading practices.
(2) Content Mastery
A common sample assessment of course content (Biopsychology, Ch. 2) revealed that students in the
Redesign had lower scores (M=60, Min=20, Max=100; n=42) than those in a prior Traditional semester
(Spring 2012; M=81, Min=12, Max=100; n=13). While the course redesign used a new text, this finding
supports the need to explore other confounding variables.
(3) Course Evaluations *pending
(4) Repeaters *pending
a. What worked well?
Students who engaged in the ancillary online My Psychology Lab work (M=68, Median=75, Min=7,
Max=100) reported enjoying it, as they did the hands-on Live Psychology labs (M=70, Median=72,
Min=19, Max=100). Students also anecdotally reported enjoying Clickers in lectures. Using a
median-split procedure, we found that students who engaged more with the online MPL materials had
higher final grades.
[Figure: Mean final grades (0-100 scale) for Low-MPL vs. High-MPL engagement groups, based on the median split.]
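The median-split comparison described above can be sketched as follows: split students at the median of their MPL engagement score, then compare mean final grades between the Low- and High-MPL groups. The scores below are invented for illustration, not actual student data.

```python
# Illustrative sketch of a median-split analysis: students are divided
# at the median MPL engagement score, and mean final grades are compared
# between the two groups. All numbers here are hypothetical.
import statistics

mpl_scores = [7, 40, 60, 75, 80, 90, 100]    # MPL engagement scores (0-100)
final_grades = [55, 60, 70, 78, 82, 88, 95]  # paired final course grades

cut = statistics.median(mpl_scores)
low = [g for s, g in zip(mpl_scores, final_grades) if s < cut]
high = [g for s, g in zip(mpl_scores, final_grades) if s >= cut]

print(statistics.mean(low))   # mean final grade, Low-MPL group
print(statistics.mean(high))  # mean final grade, High-MPL group
```

Note that a median split discards information and cannot establish causation; a correlation or regression on the raw scores (as attempted in section b below) is the stronger check.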
b. What did not work?
There were no significant positive correlations between MPL coursework or lab work and other
assessments of content mastery (e.g., Exam or Quiz grades).
Overall, there seemed to be low 'buy-in' among students for the self-directed online
redesign activities. Students postponed purchasing MPL access, failed to complete
assignments, and accepted low scores on assignments despite being able to re-take
assessments.
There were significant technical difficulties in the first month of the pilot with the My
Psychology Lab technology (Pearson), which may have dissuaded student engagement.
c. What can be inferred from the course data?
Varied course material, instructors, & assessment methods led to variable DFW outcomes
across course sections.
The Redesign appears to have required greater time management and organizational skills of
students.
Students who engaged in all aspects of the redesigned course, particularly the Live Psychology
lab and MPL assignments, outperformed those who focused on only 1-2 aspects.
d. What data should you have collected?
A student survey beyond the standard course evaluation would be useful to assess which aspects
of the course students found most challenging; anecdotally, students struggled to keep track of
multiple assignments across various technology formats (MPL lab and online activities, BlackBoard
quizzes, discussion boards, and/or journal assignments) and to organize their time to stay on
track with deadlines.
It was difficult to compare sample course assessments across semesters due to the change in
Textbook, and therefore assessment materials on Quizzes and Exams. Moving forward, it will be
much easier to contrast scores on the same assessments across faculty and course formats (e.g.,
online vs. face-to-face).
VI. Plan for full implementation
Full implementation will begin in Spring 2013 and has been captured in the schedule, with each
section accommodating up to 60 students. Six (6) sections of the course will be offered, 2 of
them online. The online sections will have the same lectures (captured through Tegrity) and the
same content and structure, but will be offered asynchronously and virtually.
There are 4 lead faculty involved in the course redesign: one tenure-track, one full-time, and
two affiliate (adjunct) faculty. This represents a 50% reduction in the number of faculty
teaching the course, supporting greater consistency in course delivery.
a. What changes will you make to the model, course content, technology, etc.?
Two (2) programmatic issues are in the process of being finalized: 1) the CLEP Examination needs
to be put in place so that students may take it at the end of the semester; and 2) the location
of dedicated laboratory space for the 'Live Psychology Labs' must be determined.
A reanalysis of the Assessment methods will take place at this time, with possible development of
a Student Survey to collect more detailed input concerning the students’ perspectives on course
redesign factors.
Improve transparency through use of Technology to provide continuous feedback on student
performance in the course.
Improve Organizational structure and Timeline/Due Dates for all assignments in a streamlined
fashion to assist students with course management and time management skills.
Continued experimentation with Clickers in the classroom; analysis of the pilot survey regarding
Clicker implementation.
Are any of the issues areas where Faculty Fellows or USM Academic Affairs can help?
Sharing best practices for providing performance feedback to students, academic and personal
mentoring support, and the use of Early Alert systems.
Sharing processes to delineate confounding variables in course redesign to better assess learning
outcomes.
Additional quantitative and qualitative processes could be put into place to help account for
confounding variables (e.g., nesting effects, student/classroom level variables).