Annual Assessment Report to the College 2010-11

College: _____CSBS___________________________
Department: ___Psychology_________________________
Program: ____undergraduate major___________________________
Note: Please submit your report to (1) the director of academic assessment, (2) your department chair or program coordinator and (3) the
Associate Dean of your College by September 30, 2011. You may submit a separate report for each program which conducted assessment
activities.
Liaison: ____ Holli Tonyan ____________________________
1. Overview of Annual Assessment Project(s)
1a. Assessment Process Overview: Provide a brief overview of the intended plan to assess the program this year. Is assessment under the
oversight of one person or a committee?
During 2010-11, Holli Tonyan was the Assessment Liaison. The department formed an assessment committee that met during the Psychology
faculty retreat but subsequently had scheduling difficulties, so the retreat was its only meeting. For 2011-12, we have arranged a regular
meeting time for the committee. The department’s existing 5-year plan suggested that in 2010-11 we would assess SLO#7 “Students will
demonstrate knowledge of the theories, concepts, and empirical approaches from diverse perspectives of psychology, including biological
processes, developmental processes, individual and social processes, learning and cognitive processes” via test, writing sample, and
performance using randomly selected samples of student work across courses, services, and internships at the end of fall and spring semester.
1b. Implementation and Modifications: Did the actual assessment process deviate from what was intended? If so, please describe any
modification to your assessment process and why it occurred.
Yes – the actual assessment process did deviate from the planned process. At our department planning retreat in January 2011, a committee of
faculty set a revised agenda for the year to assess SLO 6 ("Students will demonstrate appropriate and ethical use of subjects by going through
the process of subject recruitment and debriefing the human subjects involved in psychological research"), which had not yet been assessed. We
organized the implementation of a comparison of students in PSY150 (where the ethics of conducting research with human participants is not
covered) with students in PSY321 (where that topic is directly taught and has usually been assessed). We were able to (1) assign all students at
both levels to complete an online training certification program offered by the National Institutes of Health on the ethical treatment of human
participants (phrp.nihtraining.com) and (2) then examine whether students who had taken coursework and heard lectures on the ethical
treatment of participants scored better on the same embedded test questions than students who had only completed the online module. As described
below, our results showed that students at the 300-level performed better than those at the 150-level.
July 18, 2011, Bonnie Paller
1
The original plan for assessing SLO 7 was vague and so we had to establish a method. The department has a history of using predominantly
indirect assessment to inform changes to curriculum and policy. We have not had a history of using similar test questions, assignments, or
rubrics to assess SLO 7 consistently across courses or sections. As a result, we are currently working to revise our overall assessment plan and
received permission from the Office of Assessment to submit that plan at the end of the 2011-12 academic year. Since no mechanisms were in
place when I became Assessment Liaison, I surveyed faculty by email asking which of them currently assess SLO 7 in their courses. Among the
faculty who responded, the section that I taught last Fall included the largest number of students and was one of two sections offered for a
course that fulfills a major requirement (i.e., PSY313). I decided to track students' progress toward learning outcomes (specifically SLO 7)
during the course of a semester as a way to pilot a method that could be used across sections: using a series of short-answer questions that
address core themes in the field of developmental psychology, I tracked the proportion of students who correctly answered questions on
different topics within the same theme.
In addition, although not requested by the assessment committee, a faculty member independently assessed students’ performance on SLO 5
(Students will demonstrate sufficient use of statistical analysis, interpretation, and presentation of psychological data) and the results of that
comparison are included here as well.
2. Student Learning Outcome Assessment Project: Answer questions according to the individual SLOs assessed this year. If you assessed more
than one SLO, please duplicate this chart for each one as needed.
2a. Which Student Learning Outcome was assessed this year?
6. Students will demonstrate appropriate and ethical use of subjects by going through the process of subject recruitment and debriefing
the human subjects involved in psychological research.
2b. What assessment instrument(s) were used to gather evidence about this SLO?
In the Spring 2010 semester, the instructors of PSY321: Psychological Research Methods (a required course in psychology) had students
complete the Human Participants Protection Education for Research Teams online certification by the National Institutes of Health
(http://phrp.nihtraining.com/users/login.php). All students were required to pass this certification since they are required to conduct research
as part of the course requirements. In addition, in the Fall 2010 semester, all the students in one section of PSY150: Principles of Human
Behavior were also required to take and pass the online certification. The PSY150 section (Fall 2010) and one large section of PSY321 were given
a 10-point quiz (developed by Dr. Plunkett, Professor, Psychology) that assesses fundamental knowledge regarding appropriate and ethical use
of subjects in psychological research.
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example,
what type of students, which courses, how decisions were made to include certain participants.
Students in all sections of PSY321 in the Spring 2010 semester, and students in one large section of PSY150 were required to pass the online
certification. The 10 questions on ethical behavior were given to a large section of PSY321 (Spring 2011, N = 120) and PSY150 (Fall 2010, N =
121). PSY321 is a gateway course for the psychology major. PSY150 is a general education course, but it enrolls many psychology majors since it is a
prerequisite to other psychology courses. By assessing a freshman-level class and a junior-level class, we could examine differences in knowledge
based on students' class level.
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different
points) or was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
A cross-sectional comparison was used to assess the level of knowledge of students in a freshman-level class versus students in a junior-level class.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the evidence was analyzed and highlight important findings
from the collected evidence.
First, 100% of the students in PSY321 and PSY150 completed the online certification (as required in the classes). The results indicated that, on
average, students in the junior-level class (i.e., PSY321) scored higher: they answered 90% of the 10 ethics quiz questions correctly, while
students in the freshman-level class (i.e., PSY150) answered 72% correctly. If mastery of the content is defined as answering at least 7 of the 10
questions correctly, then 95% of the students in PSY321 exhibited mastery, compared with 72% of the students in PSY150.
The following table shows the number and percent of students in the two courses who answered different numbers of questions correctly.

Number of questions       PSY150             PSY321
answered correctly        N      % of 122    N      % of 121
1 to 4                    27     22%         0      0%
5 to 6                    8      7%          7      6%
7 to 8                    33     27%         18     15%
9 to 10                   54     44%         96     79%
Total                     122                121
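The table's percentages and the mastery rates can be recomputed directly from the raw counts. This is a minimal sketch: the bin labels and counts follow the table, and the zero count for the lowest PSY321 bin is implied by the column total. The recomputed mastery rates (71% and 94%) are close to the 72% and 95% quoted in the text; the small discrepancy may reflect rounding or slightly different totals.

```python
# Raw counts of students by number of quiz questions answered correctly,
# taken from the table above.
counts = {
    "PSY150": {"1-4": 27, "5-6": 8, "7-8": 33, "9-10": 54},
    "PSY321": {"1-4": 0, "5-6": 7, "7-8": 18, "9-10": 96},
}

for course, bins in counts.items():
    total = sum(bins.values())
    # Mastery was defined as answering at least 7 of the 10 questions correctly.
    mastery = bins["7-8"] + bins["9-10"]
    print(f"{course}: total={total}, mastery={mastery / total:.0%}")
```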
2f. Use of Assessment Results of this SLO: Think about all the different ways the resulting evidence was or will be used to improve
academic quality. For example, to recommend changes to course content/topics covered, course sequence, addition/deletion of
courses in program, student support services, revisions to program SLO’s, assessment instruments, academic programmatic changes,
assessment plan changes, etc. Please provide a clear and detailed description of how the assessment results were or will be used.
These assessment results indicate a clear distinction in knowledge related to SLO 6 between students in required 100-level and 300-level
courses. In both courses, the majority of students achieved mastery of SLO 6; nonetheless, students showed a higher level of mastery (as
demonstrated by higher scores overall and by a larger proportion of students answering 7 or more items correctly) at the 300-level, where the
content was directly taught in the course in addition to the online training module. One recommendation for the department is to require all
professors teaching PSY321 to have their students complete the online certification by the NIH. Although these results indicate that students
are learning about the ethical treatment of participants, we have not yet directly assessed students' ability to conduct research in an ethical
manner – this would be a higher level of mastery for SLO 6 that we would expect at the capstone (400) level.
2a. Which Student Learning Outcome was assessed this year?
5. Students will demonstrate sufficient use of statistical analysis, interpretation, and presentation of psychological data.
2b. What assessment instrument(s) were used to gather evidence about this SLO?
The instructors of PSY320, a required course in Psychological Statistics, worked directly with the prior Assessment Liaison and agreed on core
content to be taught in the course. The faculty then also created a series of test questions that they give at the beginning and end of each year
to test student gains on the taught content. The questions are embedded in exams used in the course. There are 20 questions that assess
fundamental knowledge required for the course at baseline as well as material students should learn during the semester.
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of
students, which courses, how decisions were made to include certain participants.
Data were gathered in PSY320. This is a gateway course for our major in which the fundamentals of quantitative reasoning and literacy in
psychology are introduced and practiced. These ideas will then be practiced again and used in inquiry projects in many of the capstone courses.
In Fall, 2010 and Spring, 2011, all students in one section each semester were assessed. Although we would like to see a longer trajectory of
development among our students, we do not currently have a way of meaningfully assessing the development of students’ quantitative literacy
into the capstone courses. Currently, students are not required to use quantitative reasoning in all capstones. Past efforts have assessed
advanced quantitative literacy in only those elective courses where advanced statistics are taught, and students who opt for courses that do use
quantitative reasoning represent a biased sample (i.e., only those students who elect to take advanced stats!).
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was
a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
Data were longitudinal during a semester: all students in the gateway course were assessed at the beginning and end of the course. We also
have data cross-sectionally since these test items have been given to students in previous years. However, for the purposes of this report, I
focused on the most recent assessment of student learning outcomes in PSY320.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the evidence was analyzed and highlight important findings from the
collected evidence.
Data were analyzed by examining the proportion of students who answered each item correctly and comparing the proportions at intake and exit. The
exact same questions were asked on both exams. There is little risk of practice effects since there is a sufficient time delay between pre- and
post-test. The results suggest that larger proportions of students were able to correctly answer the test questions at the end of the course than at the
beginning. In both semesters, the majority of items showed increases of 10% or greater in the proportion of students correctly answering the
items (14 items increased between 10-45% in Fall; 12 items increased 10-40% in Spring). Based on the results, it is evident that students had
already mastered some of the prerequisite content knowledge at intake; therefore, only small gains were detected for those items over time (e.g.,
the concepts of median, mean, and mode). Analyses of a few items suggest that some content may need to be covered differently because the
proportions of students correctly responding started low and stayed low (e.g., in Fall only one question showed this pattern: standard error; in
Spring, percentage ranks, standard error, and inferential statistics showed this pattern). These results are currently being used to revise how
standard error, in particular, is taught. A small number of items showed a negative change over time (fewer students answered the item
correctly at the end than at intake),
but no items maintained that pattern across both semesters.
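The item-level screening described above can be sketched as follows. The item names and pre/post proportions here are placeholders (the report gives only ranges, not per-item values); only the classification rules mirror the analysis: flag items with gains of 10% or more, items that started low and stayed low, and items with negative change.

```python
# Hypothetical per-item proportions correct at intake (pre) and exit (post).
# Real item-level values are not reported, so these numbers are placeholders.
items = {
    "median": (0.82, 0.88),          # largely mastered at intake: small gain
    "standard error": (0.25, 0.30),  # started low, stayed low
    "correlation": (0.45, 0.75),     # clear gain of 10% or more
}

gained = [name for name, (pre, post) in items.items() if post - pre >= 0.10]
low_low = [name for name, (pre, post) in items.items() if pre < 0.5 and post < 0.5]
declined = [name for name, (pre, post) in items.items() if post < pre]

print("gain >= 10%:", gained)
print("low at both points:", low_low)
print("negative change:", declined)
```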
2f. Use of Assessment Results of this SLO: Think about all the different ways the resulting evidence was or will be used to improve academic
quality. For example, to recommend changes to course content/topics covered, course sequence, addition/deletion of courses in program,
student support services, revisions to program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc.
Please provide a clear and detailed description of how the assessment results were or will be used.
This assessment has helped in a number of ways. It allows our faculty to focus on the content that they have established as central to our major
and our program learning outcomes. It also helps individual faculty members determine how to proceed with course content by giving them a
baseline level of understanding. Finally, it helps individual faculty members determine whether or where they may want to modify the course
content to address persistent confusion or lack of improvement (e.g., standard error as described in 2e).
2a. Which Student Learning Outcome was assessed this year?
7. Students will demonstrate knowledge of the theories, concepts, and empirical approaches from diverse perspectives of psychology,
including biological processes, developmental processes, individual and social processes, learning and cognitive processes.
2b. What assessment instrument(s) were used to gather evidence about this SLO?
One section of PSY313, Developmental Psychology, has focused on directly teaching students the themes central to psychology and on
repeatedly assessing this SLO via short-answer questions on every exam. As a result, although the specific short-answer questions assess
different content areas, they all address the underlying theme in a way that is explicitly discussed with students. In addition, embedded
multiple-choice exam questions also address the themes via different content areas across the span of a semester. Both short-answer and
multiple-choice exam questions, then, provide a window into changes in students' understanding of the theories, concepts, and empirical
approaches concerning biological, developmental, individual and social, and cognitive processes. A course on developmental psychology offers
unique insight into students' understanding of many aspects of psychology because this specialty traces the development of many different
facets of the field.
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of
students, which courses, how decisions were made to include certain participants.
All students in one section of PSY313 from Fall 2010 (N = 120) were assessed, and their results were compared to those of all students from one
section of PSY313 taught by the same professor in the previous year (N = 48). PSY313 fulfills a major requirement and serves as a prerequisite
for capstone courses, so demonstrating that all or most students mastered some of the SLOs in this course indicates that those students are
prepared for work at the 400-level in the capstone courses.
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was
a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
The design is both cross-sectional (comparing Fall 2010 with Fall 2009) and longitudinal (within one semester). All assessment points are at the
300-level. From each exam, questions were selected based on whether they reflected core themes that cut across areas of psychology. For all
cross-sectional comparisons, data are presented only for questions that were exactly the same across both years.
One particular comparison deserves note. Students were given unlimited time to complete an online activity that was related to a module in the
course and that was later assessed again as a short answer question on an exam.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the evidence was analyzed and highlight important findings from the
collected evidence.
Comparing results within Fall 2010:
• One course objective that falls within the SLO is that students will be able to differentiate qualitative and quantitative changes. The
concepts of qualitative and quantitative change are introduced in the first weeks of class and then examples from prenatal
development, brain development in early childhood, brain development during the transition to middle childhood, and the biological
changes during puberty are all specific topics covered during the subsequent weeks of the semester. On Exam 1, students were asked to
list and describe two of each kind of change from prenatal development. 55% of students got full credit, 40% of students got partial
credit, and 5% got no credit. On Exam 4 students were asked to list and describe two of each kind of change from the transition to
middle childhood. 61% of students got full credit and 34% got partial credit (the remainder got no credit). Although the increase in
students receiving full credit is not large, these results suggest that the majority of students understood the underlying concept
and were able to apply it across multiple periods of children's development. One implication of this finding, which has been incorporated
into the course for Fall 2011, is to informally "quiz" students on the general definitions throughout the semester and to assess their
understanding of the basic concept before specific examples are covered, ensuring that all students understand the definition and
general examples early in the semester.
• Another course objective that fits within SLO 7 is for students to be able to compare and contrast theoretical explanations for different
developmental phenomena. This was assessed on Exam 2 and again on Exam 3 (for two different phenomena). On Exam 2, only 21% of
students got full credit and 78% of students received partial credit whereas on Exam 3, 50% got full credit and 39% got partial credit.
This suggests that just as we would expect at the 300 level, students are getting practice at using psychological explanations to explain
phenomena and they are beginning to gain mastery over those explanations.
• Although each short answer question assesses a different content area that has specific points that must be covered to receive full
credit, there is a core set of characteristics of answers that receive full credit across all topics. For partial credit the student must be able
to use at least one term or its definition/idea from the relevant content domain (e.g., state “the law of effect” or explain in some form
that behavior that results in a positive consequence will be used more frequently over time) and for full credit the student must be able
to correctly identify both the terms that signify the causal mechanism relevant for that question AND contrast it with the correct
opposing idea/explanation or explain why their example is a good example of the concept in question. For example, many students can
identify examples of qualitative and quantitative change, but cannot identify what makes it a good example – only students who can cite
the definitions of the two terms and explain how their examples fit the definition get full credit.
Comparing Fall 2009 (N = 48) with Fall 2010 (N = 120):
• Students did relatively well on questions that asked them to recognize the definitions of key terms across both years. For example,
students were able to recognize the definition of plasticity, or the extent to which and the conditions under which change in
developmental outcomes are possible at different points in development (a learning objective for the developmental psychology area of
our major), across both years with over 90% of students correctly identifying the definition of this term in both years (91% in 09 and 95%
in 10).
• Students also did relatively well in matching different explanations for developmental phenomena with different theoretical frameworks,
as assessed via embedded multiple-choice exam questions. For example, approximately 90% of students could correctly distinguish two
explanations for attachment (96% in 09 and 88% in 10). With regard to explanations for language development, understanding of
explanations improved from 2009 (67% correct) to 2010 (86%).
2f. Use of Assessment Results of this SLO: Think about all the different ways the resulting evidence was or will be used to improve academic
quality. For example, to recommend changes to course content/topics covered, course sequence, addition/deletion of courses in program,
student support services, revisions to program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc.
Please provide a clear and detailed description of how the assessment results were or will be used.
Although the results of these efforts to assess student learning suggest that many students are mastering key course concepts, we continue to
seek to better understand how coursework within our major actually impacts students’ learning and how learning within a course fits into the
larger structure of our major. For example, at the level of PSY313, we decided to conduct a pre-test of students’ understanding about the ages at
which children develop different capabilities so that we can then see changes over time in students’ understanding of new areas of content. In
addition, we are looking for creative ways to help students assess their own understanding at multiple points in the semester so that they can
better assess whether they understand core concepts before they need to apply them in an exam setting.
3. How do this year’s assessment activities connect with your program’s strategic plan and/or 5-yr assessment plan?
SLO 6 and SLO 7 were the last SLOs to be assessed from our previous 5-year plan, so we have now assessed each of our SLOs in some form over
the past five years. We are currently in the process of creating a long-term assessment plan. At present, assessment activities have been neither
systematic nor tied to a regular process of program review, but have instead been conducted on a relatively ad hoc basis. We had a working
group assembled at a retreat during January, 2011, to discuss assessment, but found that we actually need some dedicated time for the entire
department to focus on assessment together.
4. Overall, if this year’s program assessment evidence indicates that new resources are needed in order to improve and support student
learning, please discuss here.
We are a very large department with 25.5 tenure-track faculty and 23 part-time faculty helping teach the Psychology major. We recently exerted
much effort, as a department, to change our major in response to indirect assessments and non-documented informal assessments to increase
breadth across sub-disciplines of psychology and to give students an opportunity to explore a sub-discipline more in depth. We also spent much
effort on disseminating information regarding the new major. However, we have been in a transitional stage such that students who declared
the Psychology major before Fall 2010 could stay with the requirements of the old major unless they declared otherwise. As stated clearly by the
Department’s EPC chair, the “transition period from the old to the new major requirements is soon coming to a close. As part of the transition,
we had agreed not to uphold any new prerequisites until Fall 2012. This was because the prerequisite courses did not even exist prior to Fall
2010, when we moved to the new major. We decided to give all students a two-year window to have a chance to take the new courses before
we enforced the new prereqs. However, as of Fall 2012, ALL prerequisites will be enforced. This affects enrollment in the capstone courses.” (Email correspondence from our Department EPC Chair, Shannon Morgan, dated 9-14-11). These changes will also dramatically affect assessment
efforts and assessment results.
The university is facing budget cuts within each college and department, decreasing expenditures and maximizing faculty load. Conducting
assessment in such a context, with very few resources (i.e., none), in such a large department with such tremendous recent changes will require
time for organization of efforts, organization of assessment batteries, networking with individual faculty, working with teams of faculty in each
sub-discipline of psychology, and evaluating the use of possible standardized measures for program assessment. Therefore, the department is in
dire need of graduate student assistantships and faculty release time. Many Psychology program assessments are costly and a budget devoted
to assessment efforts in the department (and potentially training and travel for assessment faculty) would enable the department to set the
stage for appropriate direct assessment of student learning outcomes that are meaningful to CSUN students and graduates.
In particular, the current year's assessment efforts were originally intended to cut across sections taught by multiple instructors, but had to be
scaled back because of the difficulty of coordinating multiple weekly schedules and simply staying on top of communication among the many
people involved. The Department has requested a part-time student employee to facilitate assessment. This is certainly not enough to sustain a
well-coordinated assessment of all of our majors (N = 2,504, according to a Fall 2011 Dean's Report), but should at least help with the
coordination and compilation of data from multiple sections and across multiple instructors, both full- and part-time.
We would love to see additional resources that would help us use pre- and post-testing in Moodle that could cut across courses: for example, to
assess students' understanding of neurons at the beginning of and after PSY150 (our introductory course in Psychology), at the beginning and
end of PSY250, and again in PSY313 (Development), when they will learn about neuronal changes in different parts of the brain at different
periods of development. This may help students see how their knowledge builds incrementally over time, and some psychological research
suggests that simply testing students repeatedly on certain content will help them master that content as well. However, we need training and
technical resources in order to effectively design and implement such a program. The simplifying assessment program could offer such support,
but we need faculty in the department with release time to spearhead such an effort. As an alternative to having faculty coordinate the efforts,
with such a large major it would be extremely useful to have a student employee whose responsibility it is to coordinate with faculty to ensure
that embedded questions are used across all sections and to compile relevant data from the many different faculty involved.
5. Other information, assessment or reflective activities not captured above.
After expending an enormous amount of energy having our revised curriculum approved, 2010-11 was the first year in which the new major was
implemented. This involved helping students transition to the new major, including offering new courses, changing existing courses, and waiving
prerequisites for 400-level courses to accommodate students who were enrolled under the prior major plan and for whom required prerequisites
were no longer being offered. The administrative and intellectual work involved in successfully transitioning to the new major was a major focus
of departmental curriculum efforts. We look forward to the 2011-12 year during which we can begin to fully implement the new major for the
first time. This is a time when we will begin to focus more intently on developing common assessment strategies. During 2010-11, we did have
a half-day retreat during which we had a working group focus on assessment, but we found that we need additional time during which a large
group of faculty can focus just on assessment across the new major.
6. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your
program? Please provide citation or discuss.