2013-2014 Annual Program Assessment Report
Please submit report to your department chair or program coordinator, the Associate Dean of your College, and to
james.solomon@csun.edu, director of assessment and program review, by Tuesday, September 30, 2014. You may submit a
separate report for each program which conducted assessment activities.
College: Humanities
Department: Philosophy
Program: Major
Assessment liaison: Kristina Meshelski
1. Overview of Annual Assessment Project(s). Provide a brief overview of this year’s assessment plan and process.
This year we continued gathering data for all SLOs. We held a very fruitful norming session for our rubrics, which led us to streamline both rubrics for future use in a way that does not invalidate the past data. But we also had the more ambitious goal of seriously analyzing the data we have been gathering for the last 4 years, and having some open conversations as a department about how we might "close the loop" given that analysis.
Last year and this year we have been simultaneously working on addressing two things we consider to be problems: a recent drop in the number of our majors and a long-standing lack of diversity among our majors. Philosophy as a major is heavily skewed male, and to a lesser extent white. The racial and gender makeup of our major does not mirror that of CSUN, and though the racial makeup of our major might be partly explained (if not justified) by the lack of diversity among philosophy majors throughout the US, the same cannot be said of the gender makeup of our major. We have far fewer female majors than we should expect given the number of female students at CSUN and the number of female students who major in philosophy in the US at large. Our department's current efforts to systematically study and address these problems began last year with the formation of an ad hoc Climate Committee. Those efforts were not discussed in previous assessment reports because they were not narrowly directed at improving outcomes for those students currently enrolled as majors, so we did not conceive of them as assessment-related. This year, however, we take a more expansive view of assessment and include them in our annual assessment report.
2. Assessment Buy-In. Describe how your chair and faculty were involved in assessment related activities. Did department
meetings include discussion of student learning assessment in a manner that included the department faculty as a whole?
As we have been doing for the last 3-4 years, each faculty member who taught a course required for the major was responsible for rating the relevant assignments from that course according to the relevant rubric and submitting the data to the assessment liaison. We have achieved almost complete buy-in on this from our full-time faculty members. We devoted a large portion of one department meeting to a discussion of our assessment practices in light of the data we had gathered and begun to analyze up until Fall 2013. As was agreed upon at that meeting, we invited all full-time faculty to a special norming meeting for our rubrics, which was well attended (8 out of 10, including the chair). Our assessment committee (Tim Black as Chair of the department, Adam Swenson as Associate Chair, and Kristina Meshelski as assessment liaison) met to discuss proposed revisions to the rubrics and SLOs. Adam Swenson created a database for the data we have been gathering since 2010 and created a number of different visualizations of the data to aid the assessment committee in analyzing it. The department's ad hoc Climate Committee also met multiple times in 2013-2014; this committee includes Adam Swenson (Chair), Abel Franco, Kristina Meshelski, Weimin Sun, and Gregory Velazco y Trianosky.
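Since this report does not describe how that database or those visualizations were built, the following is only an illustrative sketch of how rubric data of this kind might be organized and summarized in Python; the file name and column names are assumptions, not our actual schema.

```python
# Hypothetical sketch only: the file name ("rubric_scores.csv") and the
# column names (level, year, rubric_item, score) are assumptions, since this
# report does not describe the actual database schema.
import pandas as pd
import matplotlib.pyplot as plt

# One row per rating: course level ("200", "300", "400"), academic year,
# rubric item, and the 0-4 score the instructor assigned.
scores = pd.read_csv("rubric_scores.csv")

# Mean score per rubric item, broken out by course level.
means = (scores
         .groupby(["rubric_item", "level"])["score"]
         .mean()
         .unstack("level"))

# One possible visualization: side-by-side bars per rubric item.
means.plot(kind="bar", figsize=(10, 4))
plt.ylabel("Mean score (0-4)")
plt.tight_layout()
plt.show()
```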
3. Student Learning Outcome Assessment Project. Answer items a-f for each SLO assessed this year. If you assessed an additional
SLO, copy and paste items a-f below, BEFORE you answer them here, to provide additional reporting space.
3a. Which Student Learning Outcome was measured this year?
SLO 1- Students will develop a critical understanding of the work of central thinkers in the Western philosophical tradition.
3b. Does this learning outcome align with one or more of the university’s Big 5 Competencies? (Delete any which do not apply)
3c. What direct and/or indirect instrument(s) were used to measure this SLO?
History rubric. See Appendix B for the rubric itself, which provides descriptions of the eight items on which scores are given.
3d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different
points) or was a cross-sectional comparison used (Comparing freshmen with seniors)? If so, describe the assessment points used.
We used a cross-sectional comparison of the earliest papers written in the 200 level history of philosophy courses (201 and 202) with the last papers written in the 400 level history of philosophy courses (401 and 402). We looked only at papers written by philosophy majors in these classes, using data going back to 2010, so some students are undoubtedly represented multiple times.
3e. Assessment Results & Analysis of this SLO: Provide a summary of how the results were analyzed and highlight findings from the
collected evidence.
We compared the average overall score at the 200 level with the average at the 400 level. There was only a very slight increase. This small increase was mostly consistent when we broke down the scores for each item on the rubric. We also compared the proportion of students who scored a 3 ('proficient' on our history rubric) or above at the 200 level with the proportion who were proficient at the 400 level. Across the 200 level courses, at least a small majority of philosophy majors score a 3 or above on most items on our rubric, meaning that the majority of our students entering the major have a relatively good understanding of historical philosophy by the time they write their first paper. This proportion stayed roughly the same at the 400 level. In other words, it looks like we do not significantly increase the number of students who demonstrate a proficient understanding of central thinkers in the history of philosophy between the first paper in the 200 level courses and the last paper at the 400 level. That said, the majority of students seem to be graduating with a proficient or better understanding of central thinkers in the Western tradition. In general, though, the data we have is very messy, so it is hard to draw any conclusion with much certainty.
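For concreteness, the two comparisons described above (average overall score, and proportion scoring 3 or above) could be computed along the following lines. This is a sketch under the same hypothetical data layout as the earlier sketch; the "rubric" column used to select history-rubric ratings is likewise an assumption.

```python
# Minimal sketch of the SLO 1 comparison, under the hypothetical layout used
# in the earlier sketch. The "rubric" column and its "history" label are
# assumptions for illustration.
import pandas as pd

scores = pd.read_csv("rubric_scores.csv")
history = scores.loc[scores["rubric"] == "history"].copy()

# Average overall score at the 200 level vs the 400 level.
print(history.groupby("level")["score"].mean())

# Proportion of ratings at 3 ('proficient') or above, per item and level.
history["proficient"] = history["score"] >= 3
print(history
      .groupby(["level", "rubric_item"])["proficient"]
      .mean()
      .unstack("level"))
```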
3f. Use of Assessment Results of this SLO: Describe how assessment results were used to improve student learning. Were
assessment results from previous years or from this year used to make program changes in this reporting year? (Possible changes
include: changes to course content/topics covered, changes to course sequence, additions/deletions of courses in program, changes
in pedagogy, changes to student advisement, changes to student support services, revisions to program SLOs, new or revised
assessment instruments, other academic programmatic changes, and changes to the assessment plan.)
We have decided to make a number of changes to our assessment practices. We are going to take out the "analysis" section
of the history rubric, which will hopefully make it easier to use and increase inter-rater reliability. For 2014-2015 we plan to have a
committee meet and score a representative sample of papers according to this new shorter rubric, rather than ask each instructor to
rate their own papers. Since the 400 level history of philosophy courses are not required for the major, we are considering scoring
first and last papers within each of the 200 level history of philosophy courses. We also plan to begin gathering papers to populate
EAS.
Our Climate Committee has found that almost all of our majors are transfer students from community college. That means
they spend less time with us than a non-transfer student might. We think this might be part of the reason we see so little change in
scores between the entry courses and the 400 level courses. For this reason, and because we want to make our major more
available and welcoming to non-transfer students, we think our students would benefit from more lower division courses, especially
courses like Phil 150: Introduction to Philosophical Thought, which include the study of central figures in the Western philosophical tradition.
3a. Which Student Learning Outcome was measured this year?
In addition to collecting history rubric data for SLO 1, we collected data from the standard rubric for assessment of SLOs 2-6:
[2] Students will read and comprehend philosophical texts.
[3] Students will respond critically and analytically to philosophical positions, arguments and methodologies, including positions,
arguments and methodologies involved in the investigation of significant issues in epistemology, metaphysics, and value
theory.
[4] Students will defend their own philosophical positions and arguments.
[5] Students will write well-organized philosophical essays in which they clearly articulate philosophical positions and arguments.
[6] Students will write well-organized philosophical essays in which they clearly and effectively present and defend their own
philosophical positions and arguments.
3b. Does this learning outcome align with one or more of the university’s Big 5 Competencies? (Delete any which do not apply)
Critical Thinking
Written Communication
3c. What direct and/or indirect instrument(s) were used to measure this SLO?
Standard departmental rubric. See Appendix D for the rubric itself, which describes 18 items on which scores are given. The first
five, in the ‘Argumentation’ section, align most directly with the Critical Thinking Competency. The remaining items align directly
with the Written Communication competency.
3d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different
points) or was a cross-sectional comparison used (Comparing freshmen with seniors)? If so, describe the assessment points used.
We used a cross-sectional comparison of the earliest papers written in the 300 level "gateway" courses required for the major with the last papers written in the 400 level philosophy courses. We looked only at papers written by philosophy majors in these classes, using data going back to 2010, so some students are undoubtedly represented multiple times.
3e. Assessment Results & Analysis of this SLO: Provide a summary of how the results were analyzed and highlight findings from the
collected evidence.
We compared the average overall score at the 300 level with the average at the 400 level. There was only a very slight increase. This small increase was not consistent when we broke down the scores for each item on the rubric, but there seems to be no clear moral to draw from which items on the rubric showed an increase and which did not; the data is very messy. We also compared the proportion of students who scored a 3 ('accomplished' on our standard rubric) or above at the 300 level with the proportion who were accomplished at the 400 level. Across the 300 level courses, for many of the items on the rubric, most of the majors score a 3 or above, meaning that the majority of our students entering the major already demonstrate achievement of these SLOs. When we compared this with the proportion that scored 'accomplished' at the 400 level, the proportion went down on many individual items on the rubric. See the following figures for a comparison of the entry (300 level) classes with the senior (400 level) classes on the 'Argumentation' section of the standard rubric. Satisfactory = 3 or above; Unsatisfactory = below 3.
[Figures omitted: proportions of majors scoring Satisfactory vs. Unsatisfactory on the 'Argumentation' items of the standard rubric, 300 level vs. 400 level.]
In this case the situation appears to be worse than it was for SLO 1: not only are we failing to increase the number of students who are accomplished with respect to SLOs 2-6, but we are somehow graduating classes with a lower proportion of accomplished students than they had when they entered the major. We think it is possible, given the nature of the students who tend to be attracted to the major, that our students on average have high critical thinking abilities when they enter. (It seems that students perceive philosophy as a "hard" major.) But we of course hope that we are not having a negative effect on their critical thinking abilities. Course progression issues may be partly at fault, i.e. students taking 400 level courses before they have finished the 300 level courses. A lack of inter-rater reliability also seems likely: for both the 300 level and the 400 level courses, there are big differences in the score distributions between various years of data, which is also a sign that our data gathering techniques are at fault.
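The year-to-year comparison of score distributions mentioned above could be run roughly as follows; again, this is only a hypothetical sketch against the illustrative table used earlier, not the procedure the committee actually used.

```python
# Illustrative sketch of the year-to-year consistency check: large shifts in
# the score distribution between years can signal inconsistent rating rather
# than real changes in student work. File and column names are assumptions.
import pandas as pd

scores = pd.read_csv("rubric_scores.csv")

# Share of each score (0-4) by course level and academic year.
by_year = (scores
           .groupby(["level", "year"])["score"]
           .value_counts(normalize=True)
           .rename("share")
           .reset_index())
print(by_year)
```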
3f. Use of Assessment Results of this SLO: Describe how assessment results were used to improve student learning. Were
assessment results from previous years or from this year used to make program changes in this reporting year? (Possible changes
include: changes to course content/topics covered, changes to course sequence, additions/deletions of courses in program, changes
in pedagogy, changes to student advisement, changes to student support services, revisions to program SLOs, new or revised
assessment instruments, other academic programmatic changes, and changes to the assessment plan.)
We have decided to make a number of changes to our assessment practices. We are going to take out the "writing" section of the standard rubric, which we hope will make it easier to use and increase inter-rater reliability. Our norming session indicated that this is the part of the rubric we understand the least and disagree about the most. Furthermore, it isn’t directly related to our SLOs, so we think we can remove it without losing anything of value. For 2014-2015 we plan to have a committee meet and score a representative sample of papers according to this new shorter rubric, rather than asking each instructor to rate their own papers. We also plan to begin gathering papers to populate EAS.
We have made previous efforts to advise students to take 400 level classes only after taking 300 level classes, but it appears those efforts have not been sufficient.
Our Climate Committee has found that almost all of our majors are transfer students from community college. That means they spend less time with us than a non-transfer student might. We think our students would benefit from more lower division courses; this would also make our major more accessible and welcoming to freshman students who might want to consider the major.
3a. Which Student Learning Outcome was measured this year?
SLO 7 (which concerns differentiating an argument’s validity from an argument’s soundness; see 3d below).
3b. Does this learning outcome align with one or more of the university’s Big 5 Competencies? (Delete any which do not apply)
Critical Thinking
3c. What direct and/or indirect instrument(s) were used to measure this SLO?
This year we used the final exams of philosophy majors enrolled in Phil 200 and Phil 230 to measure SLO 7. Each instructor was
asked to evaluate the exams according to the SLO itself on a scale of 0-4: 4 = Exemplary; 3 = Accomplished; 2 = Competent; 1 =
Marginal; 0 = Unsatisfactory.
3d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different
points) or was a cross-sectional comparison used (Comparing freshmen with seniors)? If so, describe the assessment points used.
Every philosophy major is required to take either Phil 200 or Phil 230; these courses focus on teaching the skills described in SLO 7. Previously, we had assessed SLO 7 longitudinally, using specially designed pre- and post-tests in Phil 230. Giving out these tests took up instructional time but yielded very little data for each class (because so few majors are enrolled in any given section), so the use of instructional time was not justified. This year we therefore experimented with assessing SLO 7 in a way that does not involve any comparison: we merely evaluate each student once at the end of the relevant class. In this case, we believe such data can be valuable because the skills described in SLO 7 (differentiating an argument’s validity from an argument’s soundness) are so specialized that very few students possess them before taking the relevant courses. The method can be understood as longitudinal if we assume our students start at 0 with respect to SLO 7.
3e. Assessment Results & Analysis of this SLO: Provide a summary of how the results were analyzed and highlight findings from the
collected evidence.
Since adopting this method of assessing SLO 7 in Fall 2012, we have gathered data on fewer than 20 students overall.
3f. Use of Assessment Results of this SLO: Describe how assessment results were used to improve student learning. Were
assessment results from previous years or from this year used to make program changes in this reporting year? (Possible changes
include: changes to course content/topics covered, changes to course sequence, additions/deletions of courses in program, changes
in pedagogy, changes to student advisement, changes to student support services, revisions to program SLOs, new or revised
assessment instruments, other academic programmatic changes, and changes to the assessment plan.)
Previous methods of assessing SLO 7 have been found unsatisfactory and unnecessarily disruptive to the relevant courses. But the
current method now seems unlikely to reach enough majors to generate enough data to analyze. We are discussing evaluating SLO
7 in the same way we evaluate SLOs 2-6.
4. Assessment of Previous Changes: Present documentation that demonstrates how the previous changes in the program resulted in
improved student learning.
The changes discussed above are about to be implemented, so we cannot yet report their results.
5. Changes to SLOs? Please attach an updated course alignment matrix if any changes were made. (Refer to the Curriculum Alignment
Matrix Template, http://www.csun.edu/assessment/forms_guides.html.)
6. Assessment Plan: Evaluate the effectiveness of your 5 year assessment plan. How well did it inform and guide your assessment
work this academic year? What process is used to develop/update the 5 year assessment plan? Please attach an updated 5 year
assessment plan for 2013-2018. (Refer to Five Year Planning Template, plan B or C,
http://www.csun.edu/assessment/forms_guides.html.)
At this point we have completed year 4 of our 5 year plan, and it is clear to us that there are a number of things about the plan that could be improved. We stuck to the plan very faithfully (apart from SLO 7), and it has generated a lot of strange data. But we have not yet developed a new 5 year plan.
7. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your
program? Please provide citation or discuss. No
8. Other information, assessment or reflective activities or processes not captured above.
We are considering administering a survey to our majors to better understand why they chose philosophy as a major, and whether there is anything we are doing that is unwelcoming to non-white students and/or female students. We want to increase the diversity of our faculty, increase the diversity of our course offerings, and increase our course availability for freshmen who might consider philosophy as a major.