Division of Academic Affairs Annual Assessment Report AY 10-11 Annual Assessment Plan AY 11-12 For Undergraduate & Graduate Degree Programs Program Information Name of Program: Sociology College: CHABBS Prepared By (Department Chair/Program Coordinator): Linda Shaw Date: 9/12/11 Email Address: lshaw@csusm.edu Extension: 8026 PART A: Annual Assessment Report AY 10-11 Due by May 27, 2011 Please note: Should you need additional time to analyze data gathered in Spring 2011, please send an email requesting an extension to Jennifer Jeffries (jjeffrie@csusm.edu). 1) Please describe: A. the program student learning outcomes you focused on for assessment this year. B. the assessment activities you used to measure student learning in these areas. C. the results of your assessment(s). D. the significance of your results. A. Student Learning Outcomes Assessed The Sociology Department's 2010-2011 assessment of Student Learning Outcomes (SLOs) for the Sociology major focuses on the SLO that concerns understanding issues of diversity. Specifically, this includes social processes and relations related to race, class, gender, and other consequential social characteristics that shape human experience. As stated in the University Catalog, the SLO for this major related to this issue is: Sociology SLO 1: "Analyze and interpret the diversity of social experience using a sociological perspective, especially as they relate to race, class, gender, age, sexual preference, religion, and nationality." While issues of diversity are fundamental to sociology and are stressed in numerous courses in the major, this year's assessment activities centered on core courses that the Department has designated as building proficiency in this SLO because of their primary focus on issues of diversity and inequality. For the Sociology major, these courses include: Soc. 101-Introduction to Sociology, Soc. 311-Inequality, Soc. 313-Race and Ethnicity, and Soc.
315-Gender and Society. B. Assessment Activities Used to Measure Student Learning Team Formation: We formed an Assessment Committee composed of six faculty members: Kristin Bates, Sharon Elise, Karen Glover, Donna Goyer, Linda Shaw, and Jill Weigt. Design: We assessed SLO 1 using a pre- and post-test design. At the beginning of the course and again at the end, students were presented with the same scenario and three open-ended questions gauging their ability to interpret the scenario using sociological understandings. The purpose of the design was to assess changes that occurred over the course of the semester in students' understandings of how race, class, and gender relations may shape human experience. In particular, we hoped to assess changes in students' abilities to apply sociological concepts, as opposed to individualistic and/or pathologizing explanations, to an understanding of life circumstances involving race, class, and gender diversity. Assessment Tool: As described above, the Committee developed an assessment instrument consisting of a scenario and three questions calling for open-ended responses designed to measure student understandings of the role of diversity, specifically race, class, and gender, in shaping human experience. The scenario described the situation of a person whose circumstances and life chances, from a sociological perspective, were shaped by race, class, and gender relations in society. Additionally, the assessment tool allowed us to collect demographic data on the students that might be useful in our analysis (see Appendix A for the scenario and questions). Data Collection: Members of the Assessment Committee administered the pre- and post-test in a total of 10 sections of the following core courses in the Sociology major: Soc. 101 (3 sections), Soc. 311 (4 sections), Soc. 313 (2 sections), and Soc. 315 (1 section).
The assessment instrument was administered using MOODLE to the two online sections included in the assessment (1 section of Soc. 101 and 1 section of Soc. 311). Instructors were not present while the assessment was administered, and they were not shown the assessment instrument in order to avoid, to the extent possible, any tendency for the instrument to influence how the course was taught. Students taking the pre-test were also not told that a post-test would be administered. Data Analysis: The Assessment Committee developed a scoring rubric that required the evaluator to rate responses to the three questions on a scale of 1-5. Scores at the high end of the scale (5) reflected answers that emphasized sociological theories, using specific language to identify those theories and concepts; scores at the mid-range of the scale (3) reflected answers that employed sociological concepts and understandings but without identifying specific concepts or theories learned in the course; and scores at the lower end of the scale (1) reflected more individualistic and/or pathologizing understandings (see Appendix B for the rubric used in scoring student responses). The Assessment Committee met twice to discuss the rubric for scoring pre-test and post-test answers and a third time to practice scoring and to norm their evaluations. The Committee then broke into two teams, each of which read half of the student responses. Each team member read and scored the portion of student responses assigned to their team and then met for a "norming" session to discuss divergent scores. This process helped us achieve consensus about how to apply the categories of analysis consistently as the scoring proceeded.
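The "norming" step described above amounts to flagging responses whose two raters' rubric scores diverge enough to warrant discussion. A minimal sketch of that idea follows; the function name, the two-point threshold, and all scores are invented for illustration, not taken from the Committee's records.

```python
# Hypothetical sketch of the norming step: find responses where two raters'
# rubric scores (on the report's 1-5 scale) diverge by 2 or more points,
# so the team can discuss them and converge on consistent scoring.

def flag_for_norming(scores_a, scores_b, threshold=2):
    """Return indices of responses whose two rubric scores differ by
    `threshold` or more points and so should be discussed by the team."""
    return [i for i, (a, b) in enumerate(zip(scores_a, scores_b))
            if abs(a - b) >= threshold]

# Invented example: two raters scored the same five responses.
rater_a = [3, 5, 1, 4, 2]
rater_b = [3, 2, 1, 5, 2]
print(flag_for_norming(rater_a, rater_b))  # index of the divergent response
```

In a real session, the flagged responses would be re-read jointly and rescored until the raters agree on how to apply the rubric's anchors.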
After scoring student answers to the open-ended questions, the Committee met twice to discuss a strategy for analyzing and presenting our findings (for example, which student background characteristics might be most important for understanding levels of student mastery of the SLO) and to discuss the significance of the study findings. Findings from this analysis are presented below. C. Results of Assessment Frequencies Three open-ended test questions were developed to measure pre/post-test changes in how far students' perceptions reflected sociological understandings at the end of the semester. The three questions measured changes in students' application of sociological principles regarding their perceptions of the fundamental problem(s) presented in the test scenario (Question #18), the underlying causes of those problem(s) (Question #19), the best possible solution(s) to the problem(s) (Question #20), and the students' reasoning behind their proposed solutions (also Question #20). The questionnaire that accompanied the assessment scenario asked for basic demographic information such as the number of units taken during the semester, hours worked per semester, race, gender, etc. The sample size was 180 after cleaning the data and removing non-matched pairs. Participants in the study ranged in age from 18-48 years with an average age of 22.22 years. They were taking an average of 14.03 units in the semester in which the scenario was distributed, with the majority (70%) taking the class in question to satisfy a major requirement and 15.6% taking it for GE credit. Students in the sample were predominantly female (64.4% female and 35.6% male). 11.2% were Freshmen, 17.3% were Sophomores, 50.8% were Juniors, and 20.7% were Seniors. 16.1% of the sample were employed full time, and 11.1% cared for dependents. The educational level of 35% of students' fathers, and of 41.7% of their mothers, was high school or less.
40.6% were of Hispanic, Latino, or Spanish origin. Only 7.3% of the sample claimed a GPA lower than 2.5. Results Tables 1 and 2 show the t-tests for paired samples for our variables of interest. While we initially planned to present results for each course individually, the number of cases (n) in Soc 313 and 315 was not sufficient to conduct our analyses in this way. Instead, we chose to present our results by grouping the courses into lower division (Sociology 101-Introduction to Sociology) and upper division (Sociology 311-Inequality, 313-Race and Ethnicity, and 315-Gender and Society) designations. Table 1 presents the findings for the lower division Sociology core course (Sociology 101), and Table 2 presents the findings for the upper division Sociology core courses (Sociology 311, 313, and 315) that emphasize SLO 1. Table 1 here We created a composite score for each participant by adding their scores for all three questions. Table 1 shows that the composite mean for the pre-test questions in Sociology 101 was 1.5083, compared to the post-test composite mean of 1.6083. The difference in these means was not significant. The pre-test mean for the question, "What is the fundamental problem going on in this scenario?" was 1.5625, while the post-test mean was 1.6625. The difference in these means was also not significant. The pre-test mean for the question, "What are the sources or underlying causes of Luisa's problems?" was 1.4625, while the post-test mean was 1.5875. The pre-test mean for the final question, "What are the best solutions to these problems? and Why do you think these are the best solutions?" was 1.5000, while the post-test mean was 1.5750. Neither of these questions for Sociology 101 produced significant differences between the pre- and post-tests.
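The paired-samples comparison above (composite score = sum of the three question scores, then a t-test on the pre/post differences) can be sketched in a few lines. All scores below are invented for illustration; they are not the report's actual student data.

```python
# Minimal sketch of the paired-samples t statistic applied to composite
# scores. Data are hypothetical, not the assessment's real responses.
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """t statistic for paired samples: mean of the pre/post differences
    divided by the standard error of those differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Each tuple holds one student's scores on the three rubric-scored questions.
pre_q = [(1, 2, 1), (2, 1, 2), (1, 1, 1), (3, 2, 2)]   # hypothetical pre-test
post_q = [(2, 2, 2), (2, 3, 2), (1, 2, 1), (3, 3, 3)]  # hypothetical post-test

# Composite score for each participant = sum of the three question scores.
pre = [sum(s) for s in pre_q]
post = [sum(s) for s in post_q]

t = paired_t(pre, post)
# The t value is then compared against the t distribution with n - 1 degrees
# of freedom to obtain p values like those reported in Tables 1 and 2.
```

In practice the report's analyses were presumably run in a statistics package; this sketch only shows the arithmetic behind the paired comparison.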
Table 2 here Table 2 shows that the composite mean for the pre-test questions in the upper division sociology core courses was 2.1431, compared to the post-test composite mean of 2.4396. The difference in these means is statistically significant (p < .000). The pre-test mean for the question, "What is the fundamental problem going on in this scenario?" was 2.1786, while the post-test mean was 2.4536. The difference in these means was also significant (p < .001). The pre-test mean for the question, "What are the sources or underlying causes of Luisa's problems?" was 2.1732, while the post-test mean was 2.4893. The pre-test mean for the final question, "What are the best solutions to these problems?" and "Why do you think these are the best solutions?" was 2.0761, while the post-test mean was 2.3949. Both of these questions for the upper division Sociology core courses produced significant differences between the pre- and post-test means (both at p < .000). In the upper division sociology courses, whether student learning was measured through a composite score or by examining the individual differences in the scenario questions, there was a significant difference in student learning and understanding of SLO 1 between the beginning and end of the semester. Overall, the assessment results show that for the majority of courses examined, there is a significant difference in students' understanding and ability to apply sociological understandings related to issues of diversity between the beginning and end of the semester. However, the differences between the pre- and post-test scores were not significant in the lower division course, Sociology 101. Yet when we examined the relationship between the prior number of sociology courses taken and mean differences in pre- and post-test answers for the sample as a whole, the data in Tables 3 and 4 show that the number of courses previously taken in the Department has a significant impact on student learning.
Table 3 here Table 4 here The mean composite score for those students who have had very little prior sociology is 1.7835, while those with a sociological background have a pre-test mean of 2.2901. Table 4 shows that this difference is statistically significant (p < .000), meaning that students with prior sociological background were more likely to score higher on the pre-test than students with no background. In addition, Table 3 shows the mean differences in the post-test composite scores between those who have taken very little sociology, 1.9880, and those who have taken 2 or more sociology classes prior to their current course, 2.6667. Table 4 also shows that this difference is significant (p < .000), meaning that while both groups increased their understanding, those with prior sociological experience learned more than those just being introduced to sociological concepts. Given these findings, we decided to run the analysis for the Sociology 101 class separately. However, it turned out that only 1 student in the Sociology 101 course had prior sociological experience (measured as having taken 2 or more sociology classes), which we believe probably explains the difference in the pre- and post-test means in this course (compared to the upper division sociology core courses) and the fact that there was no significant difference between the pre- and post-test means in the course. Since we can only assume that this accounts for the lower means, we looked for a second characteristic that might explain the differences. Tables 5 and 6 show these analyses. Table 5 here Table 6 here Table 5 shows the mean composite score for majors vs. non-majors taking the Sociology 101 course. Note that 31 of the students in the course are non-majors, while 9 are majors. The mean for the pre-test composite score for non-majors is 1.4194, while the pre-test mean for majors is 1.8148.
Table 6 shows that this difference is statistically significant (p ≤ .01) (we can assume equality of variance because the Levene test is significant). This means that even though these students have no prior sociological background, those who have an interest in sociology, as measured by declaring the major, score higher on the pre-test than non-majors. In addition, Table 5 shows the mean differences in the post-test composite scores between non-majors, 1.5054, and majors, 1.9630. Table 6 also shows that this difference is significant (p < .05), meaning that while neither group had much sociological background, those who have declared an interest in sociology showed a stronger understanding of diversity issues from a sociological perspective than non-majors at the end of the semester. D. Significance of Results Overall, tests of significance used in this assessment show student success in mastering the program's SLO focused on issues of diversity in the core courses studied. In the introductory course (Soc 101), student understanding increased over the course of the semester, though the differences between the pre- and post-test scores were not significant. Even so, when commitment to studying the discipline was taken into account, students who had declared Sociology as a major demonstrated significantly greater mastery of SLO 1 than non-majors. By contrast, when upper division core courses are considered, differences between pre- and post-test results were statistically significant. This suggests that while the majority of students do not show mastery of SLO 1 in their first semester, students' ability to analyze and interpret the diversity of social experience using a sociological perspective improves incrementally with greater exposure to sociological ways of thinking as they progress through the major.
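The majors vs. non-majors comparison reported in Tables 5 and 6 is a between-groups test on two independent samples. A minimal pooled-variance sketch of that arithmetic follows; it assumes equal group variances, and every composite score below is invented for illustration, not drawn from the Sociology 101 sample.

```python
# Hedged sketch of an independent-samples (Student's) t statistic with a
# pooled variance estimate, as used when groups such as majors and
# non-majors are compared. All data are hypothetical.
import math
from statistics import mean, variance

def pooled_t(group1, group2):
    """Student's t statistic for two independent samples, pooling the
    two sample variances weighted by their degrees of freedom."""
    n1, n2 = len(group1), len(group2)
    sp2 = ((n1 - 1) * variance(group1) + (n2 - 1) * variance(group2)) / (n1 + n2 - 2)
    se = math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return (mean(group1) - mean(group2)) / se

majors = [2.0, 1.8, 2.2, 1.6]           # hypothetical composite scores
non_majors = [1.4, 1.2, 1.6, 1.5, 1.3]  # hypothetical composite scores
t = pooled_t(majors, non_majors)
# t is compared against the t distribution with n1 + n2 - 2 degrees of
# freedom; a preliminary Levene test would inform whether the pooled
# (equal-variance) form or an unequal-variance form is appropriate.
```

A statistics package would report the Levene test and both equal- and unequal-variance t results side by side; this sketch shows only the pooled form.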
Finally, while these assessment findings demonstrate that the Sociology program is succeeding in achieving its first SLO in the upper division courses studied, we had hoped for an even stronger showing from our students at the end of the semester than this analysis reveals. In reflecting upon these outcomes, the Committee discussed several possible reasons for these results: 1) since the approaches and content of these courses may vary from instructor to instructor, students may have learned a great deal in these courses related to SLO 1 that was not captured by our assessment tool; 2) our assessment tool set the bar high; that is, to receive a top score of 5 required rather sophisticated analysis and application of sociological concepts, theories, and perspectives that sometimes are not even seen among graduate students; and 3) the sort of sociological perspective that the program is asking students to master runs counter to dominant and strongly held ideological perspectives related to diversity in society. Strongly rooted in individualistic, rather than sociological, explanations for life circumstances, such perspectives and beliefs are very difficult to change, even after several courses focused on these topics. 2) As a result of your assessment findings, what changes at either the course- or program-level are being made and/or proposed in order to improve student learning? Please articulate how your assessment findings suggest the need for any proposed changes. Because of excessive workload during the spring 2011 semester due to a tenure-track faculty search, as well as the very labor-intensive nature of our 2010-2011 assessment, the Assessment Committee required the summer to complete its work. Therefore, in order to disseminate our findings to the faculty, we will hold an all-faculty forum at the beginning of the fall semester, including both tenure-track and lecturer faculty, to share the results of our analysis.
We anticipate that this will result in important suggestions for using assessment findings at the program level (for example, possible revision of our SLOs) as well as at the course level (for example, discussing how we want to operationalize SLO #1) as we develop ways to support one another in refining the process of communicating this important program SLO and assisting students to master it. 3) If you used the resources that were given to you as stated in your plan, please check here. X If you used them differently, please provide specifics. PART B: Planning for Assessment in 2011-2012 Required by October 3, 2011 1) Describe the proposed PSLO activities for AY 2011-12. (Note that assessing PSLOs can take many forms. Programs may find the attached list of sample assessment activities helpful. Please check the Assessment website for additional resources at www.csusm.edu/assessment/resources/). Since the Sociology Program will be undergoing Program Review during the 2011-2012 AY, no assessment will be conducted. 2) What specific assessment activities will you conduct this year in order to measure student achievement of these outcomes? 3) Please describe how the assessment support of $750 will be used. Sample of assessment activities: Development (or review and refinement) of measurable program student learning outcomes. Development of a program student learning outcome matrix, showing in what courses each PSLO is assessed, or where each PSLO is introduced, reinforced, and expected to be mastered. Identification and assessment of one or two program SLOs. Development of rubrics for assessing PSLOs. Development of commonly agreed upon "signature assignments" designed to assess mastery of PSLOs. Faculty assessment retreat. Dissemination and/or discussion of PSLOs with students in the program. Development of course learning outcomes and delineation of their relationship to PSLOs.