Annual Assessment Report to the College 2010-11

College: __CSBS__________________________
Department: __Anthropology______________
Program: __Undergraduate, Graduate________
Note: Please submit your report to (1) the director of academic assessment, (2) your department chair or program coordinator and (3) the
Associate Dean of your College by September 30, 2011. You may submit a separate report for each program which conducted assessment
activities.
Liaison: __Hélène Rougier________________
1. Overview of Annual Assessment Project(s)
1a. Assessment Process Overview: Provide a brief overview of the intended plan to assess the program this year. Is assessment under the
oversight of one person or a committee?
Anthropology is very actively engaged in assessment: there is evidence in the following report of relevant data collection and analysis,
discussion of results, and collaboration across the faculty on how to solve problems identified by this year's assessment work. Unique to this
year is our dual focus on SLO measurement and the timely assessment of emerging concerns, a combination that allows for both continuity
and flexibility. Item 2A of this report highlights measurement of undergraduate SLO 7, item 2B highlights graduate-level SLO 8
measurement, and item 5 describes our work on assessing emergent concerns: student performance in “non-traditionally”
scheduled course blocks, i.e., Friday and Saturday blocks, and in online and hybrid courses.
Although, with permission from the Office of Assessment Director, we postponed the creation and implementation of a new assessment
plan until new faculty were on board (see the Abridged version of our Program Assessment Plan for 2011-2016), in 2010-11 we nevertheless
undertook three critical projects that were highly relevant to student learning in our department: (1) getting to the bottom of why our seniors
enter final seminars with poor writing and research readiness, (2) scaffolding thesis preparation for graduate students, and (3) ensuring the
quality of course blocks and online learning.
Investigation of the first and second of these critical projects pertains to the assessment of undergraduate SLO 7 and graduate SLO 8 on
Research Skills (see items 2A and 2B), while the third (on “non-traditionally” scheduled course blocks and online/hybrid courses, see item 5)
pertains more to our need as a department to ensure that our instruction is effective in these newer forms of delivery -- a concern that we share
with the University-wide HOT/HOP Assessment Project.
In 2010-11, the Assessment Liaison facilitated work on assessment in the department in consultation with (and with the support of) the chair,
who has ultimate oversight, and with input from all department members; as of 2011-12, an Assessment Committee has been appointed to
assist with assessment activities and to support the Assessment Liaison.
Assessment is a standing item on the agenda of the monthly faculty meetings: the Liaison reports on the AALC meetings and on the
planning and progress of assessment activities, and faculty contributions to assessment are discussed.
1b. Implementation and Modifications: Did the actual assessment process deviate from what was intended? If so, please describe any
modification to your assessment process and why it occurred.
Some of the outcomes of our assessment project on graduate SLO 8 (see item 2B below) are still pending, since the results of our efforts to
scaffold thesis preparation for graduate students will only be available after graduation of the first cohort that took the newly implemented
Anth 696 courses (see the 2009-10 Assessment Report). In the meantime, we have gathered direct evidence of student learning before
implementation of the new 696 courses by compiling data on all of the students who entered our graduate program since 2000
(semester they entered the program; semester they graduated, if applicable; track – General Anthropology or Public Archeology; and, for
students in the General Anthropology track, whether they wrote a thesis or took the comprehensive exams, if applicable). While waiting for the
data needed to formally assess the efficacy of our intervention, we have also gathered indirect appraisals from faculty and students, which
indicate that the intervention is making a difference (see item 2B-e of the graduate SLO 8 assessment below).
2A. Student Learning Outcome Assessment Project: Answer questions according to the individual SLOs assessed this year. If you assessed more
than one SLO, please duplicate this chart for each one as needed.
2A-a. Which Student Learning Outcome was assessed this year?
Undergraduate SLO 7: Demonstrate the ability to collect, describe, analyze and interpret anthropological data according to generally
accepted professional anthropological practice (Research Skills).
2A-b. What assessment instrument(s) were used to gather evidence about this SLO?
Rubric used in Anth 490 capstone seminars (see attached document).
2A-c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type
of students, which courses, how decisions were made to include certain participants.
Three sections of Anth 490 offered in AY 2010-11 were selected to represent the three sub-fields of Anthropology taught in the
department: one in archaeology (490A: 8 students), one in biological anthropology (490B: 15 students), and one in cultural anthropology (490C:
13 students).
2A-d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or
was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
The rubric was used to assess the final papers of the undergraduate students enrolled in the selected Anth 490 capstone courses.
Following last year's scrutiny of student performance in 490 seminars and faculty’s subjective judgment that students were poorly prepared
to undertake the research required in the seminars (see the 2009-10 Assessment Report), we developed the rubric this year in order to identify
which component(s) of Research Skills our undergraduates master and which ones they lack. Faculty were asked to comment on the overall
rubric scores in their classes.
2A-e. Assessment Results & Analysis of this SLO: Provide a summary of how the evidence was analyzed and highlight important findings from
the collected evidence.
The rubric scores suggest that students are not proficient in critical/analytic thinking. Students also scored poorly on “Use of Theory” and
“Integration of Theory and Data,” and had trouble writing thesis statements and developing an argument. Overall, student writing
mechanics were not up to par (faculty attributed this to students’ failure to write multiple drafts of their papers).
2A-f. Use of Assessment Results of this SLO: Think about all the different ways the resulting evidence was or will be used to improve academic
quality. For example, to recommend changes to course content/topics covered, course sequence, addition/deletion of courses in program,
student support services, revisions to program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc.
Please provide a clear and detailed description of how the assessment results were or will be used.
- At the course level, one faculty member, noting that the papers students turned in read like drafts rather than final papers and interpreting
students' performance as linked to a lack of time for finalizing the essays, suggested overcoming this problem by assigning a grade to the
draft (which she requests in advance and on which she gives feedback) so that students take it more seriously.
- At the program level, another faculty member underlined the difficulty of (i.e., the lack of time for) teaching thesis statements in the class, and
suggested addressing the skills involved in articulating and developing an argument more thoroughly at an earlier point in the program.
This prompted us to update our undergraduate course alignment matrices (adding in new courses that had been developed, and courses
whose objectives had been modified, since 2005), and we will use this as a preparatory step toward revising undergraduate SLOs in the coming
year. As part of the revision process, we will also re-name SLOs as Program Learning Outcomes.
- One faculty member also suggested assessing our students' oral skills, not only their written work, after noticing evidence of analytical
thinking in students' oral progress reports on their research.
This is one of the tracks we will follow and discuss as we review our SLOs and design assessment tools for them in the coming year. In
studying the writing/research readiness issues in 490, we realized that there is actually nothing in the current SLOs that directly addresses
academic writing and communication, particularly academic writing and communication that makes use of theory and data. Undergraduate SLO
7 gets closest but emphasizes Research Skills, which is too narrow.
- Finally, the trial rubric used for assessing the 490 seminars was originally designed for a 490A section. One of the other faculty members
mentioned the difficulty of transposing her observations into this framework, as some categories were not appropriate for the type of research
students developed in her class. This again underlines the necessity of revising our SLOs with input from faculty in the different anthropology
sub-fields (see item 3 below).
2B. Student Learning Outcome Assessment Project: Answer questions according to the individual SLOs assessed this year. If you assessed more
than one SLO, please duplicate this chart for each one as needed.
2B-a. Which Student Learning Outcome was assessed this year?
Graduate SLO 8: Demonstrate the ability to collect, describe, analyze and interpret anthropological data according to generally accepted
professional anthropological practice (Research Skills).
2B-b. What assessment instrument(s) were used to gather evidence about this SLO?
Faculty-evaluated colloquium; survey of students and faculty.
2B-c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type
of students, which courses, how decisions were made to include certain participants.
Students enrolled in the first section of Anth 696B offered in Spring 2011 (n = 16).
2B-d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or
was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
A new Anth 696 sequence was implemented in Fall 2010 and a cohorted seminar series was offered starting with 696A (Anthropological
Research Design) in Fall 2010, followed by 696B (Proposal and Grant Writing) in Spring 2011.
This year's assessment involved measuring the efficacy of the changes made to our graduate courses, especially in terms of the research
skills learned by our graduate students. This was evaluated through the final project for Anth 696B, which consisted of students' oral
presentations of their thesis proposals in a faculty-evaluated colloquium. All of the full-time faculty took part in the colloquium.
In addition, the faculty who taught the first sections of Anth 696A and 696B were asked to list and briefly explain positive and negative aspects
of the new 696A/B courses (if applicable), and what should be changed, if anything, to make them more helpful for students.
Students were asked i) how helpful the 696 courses had been in maturing their thesis project, ii) how helpful the 696 courses had been in
organizing/scheduling their research so as to finish their project in a timely manner, iii) to mention and briefly explain an aspect of the
696 courses they really liked and an aspect they did not like, and iv) what should be changed, if anything, in the 696 courses
to make them more helpful for students completing their projects.
2B-e. Assessment Results & Analysis of this SLO: Provide a summary of how the evidence was analyzed and highlight important findings from
the collected evidence.
- The view of faculty is that students in the first cohort are already making more timely, effective progress than did students under the old
format (working independently): students in the old format could end the semester without completing the key requirements (an annotated
research bibliography in Anth 696A and a thesis proposal in Anth 696B), whereas they cannot pass the new seminars without completing these
requirements. Discussions among faculty following the colloquium confirmed that we all agreed that students seem more
focused and more on track to complete their research (and therefore, their degrees). Consistent with our observations, students who took the
survey generally noted that both 696 courses had been helpful for the maturation of their thesis project and for the organization/scheduling of
their research work (more than 70% even found the courses 'very helpful' or 'extremely helpful' for the maturation of their project).
- Faculty also noted that the sequence of courses promotes a strong sense of group identity and community within cohorts of graduate
students, which may not have been in place before. Students echoed this in their survey answers, mentioning the benefits to their research of
discussions with their peers (especially with students focusing on different sub-fields, who bring diverse perspectives) and of the feedback and
advice they received from their peers and professors.
- Questions were raised by both faculty members who taught 696 regarding the grading of the courses, particularly about i) the expectations
faculty across sub-fields have for students' work (e.g., what constitutes a good bibliography or a good proposal) and ii) the grading format of the
courses, which might be better treated as collaborative workshops and hence graded C/NC. One faculty member underlined the amount of work
involved for both students and faculty in a 2-unit course. Several students suggested changing the format of the course to a hybrid, as they
did not think it necessary to meet in class every week.
2B-f. Use of Assessment Results of this SLO: Think about all the different ways the resulting evidence was or will be used to improve academic
quality. For example, to recommend changes to course content/topics covered, course sequence, addition/deletion of courses in program,
student support services, revisions to program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc.
Please provide a clear and detailed description of how the assessment results were or will be used.
- The feedback from students and faculty about the new 696 course sequence suggests that we are moving in the right direction to help our
students acquire excellent research skills. Unfortunately, we will not be able to draw definitive conclusions about the improvements to our
curriculum until we get the results of the direct assessment of our students' progress when they graduate (see item 1b above).
- The suggested changes to the grading and format of the new 696 classes are on the agenda of a faculty meeting this Fall, so that they can be
discussed collegially with all faculty members.
- As with our undergraduate program, we realized, in studying the research readiness issues in 696, that although academic writing and
communication are essential aspects of the graduate curriculum across anthropology sub-fields, there is actually nothing in the current graduate
SLOs that directly addresses them. Graduate SLO 8 gets closest but emphasizes Research Skills, which is too narrow. This underlines the
necessity of revising our graduate SLOs in the coming year with input from the faculty in the different sub-fields (see item 3 below). As part of
the revision process, we will also re-name SLOs as Program Learning Outcomes. As a preparatory step, we already updated our graduate course
alignment matrices (adding in new courses that had been developed, and courses whose objectives had been modified, since 2005). We plan on
making the 696 faculty-evaluated colloquium a more formal part of the assessment of a revised SLO that includes academic writing and
communication competency, as well as a benchmark for progress in a planned longitudinal assessment of the newly scaffolded 696A-B sequence.
3. How do this year’s assessment activities connect with your program’s strategic plan and/or 5-yr assessment plan?
As we reached the end of our 5-yr assessment plan, this year's assessment covered the last SLOs of our undergraduate and graduate
programs, and we investigated student learning issues pertaining to research skills at both levels.
We successfully finished the Department's first 3-yr strategic plan and, like all departments in our college, we are just now in the process of
creating new plans. Part of the last 3-yr plan was to make hires based on assessment evidence, which we achieved.
As proposed in the Abridged version of our new Program Assessment Plan for 2011-2016, and with permission from the Office of
Assessment Director, we anticipate making revisions to our undergraduate and graduate SLOs and developing more measurable SLOs
during AY 2011-12 that will reflect the new directions taken by the department since the hire of two biological anthropologists in Fall 2008 and
Spring 2009, and especially now that three newly hired faculty and one returning administrator have joined us in Fall 2011.
4. Overall, if this year’s program assessment evidence indicates that new resources are needed in order to improve and support student
learning, please discuss here.
One result of this year's assessment activities points to the need to support a greater emphasis on academic writing and communication.
This could be addressed through funding for teaching assistants and lower class sizes in courses that have intensive expectations for academic
writing and communication, so that faculty could spend more time working with students on drafts, revisions, and presentation competency.
Another beneficial resource would be flexibility in the requirement to schedule classes, so that classes can be scheduled in ways that support
student learning, based on what the online/block scheduling study revealed (see item 5 below).
5. Other information, assessment or reflective activities not captured above.
While we mostly attended to SLOs in our assessment activities this year, we also acknowledge that we need to be flexible and ready to
address emerging concerns with assessment techniques. The effects of online and block scheduling, as well as the readiness of students to
write, do research, and communicate academically at the advanced undergraduate and graduate levels, are all critically relevant to us as a
department. While we are pleased that assessment techniques do attend to SLOs, we also seek flexibility to reverse the flow of analysis when
needed. It is in this context that we investigated student performance in “non-traditionally” scheduled course blocks, i.e., Friday and Saturday
blocks, and in online and hybrid courses:
Although students have been scheduled into 3-hour block courses from 4-7 and 7-10 on weekdays for many years, faculty were concerned that
the newly initiated 3-hour Friday/Saturday blocks were problematic in terms of student attendance and performance. Faculty also wished to
evaluate student learning in online and hybrid formats. To this end, data were collected on lower- and upper-division courses offered in two or
more formats during the last 5 semesters (Spring 2009 to Spring 2011). Information was collected only from faculty who had taught different
formats of the same course within this period. Data were provided by 8 faculty members on a total of 41 sections. Data collected were:
• enrollment and cap,
• distribution of students per academic level (to investigate a possible link between the percentage of freshmen in a section and the general
performance in that section compared to sections with a higher proportion of juniors and seniors),
• average attendance,
• number of WUs,
• average final grade,
• faculty impressions of student performance in the different formats of the courses and their opinion of the reasons for any differences they
noted in student performance.
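For illustration only (this sketch is not part of the department's formal procedure), the per-format aggregation reported below, such as the share of sections with average attendance above 80%, can be expressed in a few lines of Python; all field names and values here are hypothetical placeholders rather than the actual dataset.

# Hypothetical sketch of the per-format aggregation; records and field names are illustrative only.
sections = [
    {"format": "M/W-T/Th", "enrollment": 35, "avg_attendance": 0.86, "avg_grade": 2.9, "wu_count": 1},
    {"format": "F/S block", "enrollment": 33, "avg_attendance": 0.71, "avg_grade": 2.8, "wu_count": 3},
    {"format": "online", "enrollment": 40, "avg_attendance": None, "avg_grade": 3.1, "wu_count": 2},
    # ... one record per section (41 sections in the actual study)
]

def share_above_attendance(records, fmt, threshold=0.80):
    # Share of sections in a given format whose average attendance exceeds the threshold.
    relevant = [s for s in records if s["format"] == fmt and s["avg_attendance"] is not None]
    return sum(s["avg_attendance"] > threshold for s in relevant) / len(relevant) if relevant else None

for fmt in ("M/W-T/Th", "F/S block"):
    print(fmt, share_above_attendance(sections, fmt))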
Results and observations are as follows:
• No trend was found regarding enrollment or the distribution of students per academic level.
• Attendance was significantly different in traditional M/W-T/Th courses compared with F/S block courses, with F/S classes relatively
poorly attended: only 20% of the Friday/Saturday block classes had an average attendance of more than 80%, whereas 63% of M/W-T/Th
sections did.
• Grades. Somewhat surprisingly, the poor attendance in F/S classes did not systematically translate into lower average final grades (possibly
because of curving of the final grades). While the average grades might have been similar in most sections, several faculty noticed more
students at the low end of the grade distribution in F/S classes. One faculty member noted an average almost one full letter grade lower for their
F/S section and clearly linked it to the difference in attendance and participation.
• The general impression of faculty is that students in F/S classes are less committed and less interested.
• OL and hybrid courses: Results were more variable. For one upper-division course, students' performance was slightly worse in OL classes than
in their M/W-T/Th counterparts, which the faculty member linked to a greater dropout rate in OL classes than in brick-and-mortar
sections.
• In contrast, in a lower-division course, students' performance in the OL sections was one whole letter grade higher than in the M/W-T/Th
sections, which the faculty member believes reflects the levels of achievement by students in the classes. The faculty member cited three
advantages of OL classes to explain the difference in students' performance: 1) with no time constraints, more material can be presented;
2) discussions are better because the forum allows (indeed requires) everyone to participate; and 3) the weekly quizzes used in the online course
help ensure that students master the material.
• In the hybrid sections of another course assessed, students performed slightly better than in the traditional M/W-T/Th sections of the
course. The faculty member noticed that students in the hybrid classes initially performed worse than their peers in the M/W-T/Th classes, but
the hybrid format allowed the instructor to quickly apply corrective measures that proved effective in helping students keep up with the course
(spending more time summarizing material from the previous week’s lecture and tying it into the current week’s material).
It is critical to note that the faculty all agreed that OL and hybrid courses involve much more preparation time and student interaction time
than in-person classes (most commented that they spend twice as much time). They also mentioned the necessity of immediate responses,
notably to students' emails about problems with online material, in order to keep students up to date in the course.
6. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your
program? Please provide citation or discuss.
Not this year. We are, however, interested in collaborating with those across the university who are also interested in hybrid/online
technology/pedagogy (HOT/HOP) assessment and in academic writing/communication, and we would be happy to share our data with others
working on these issues.
Trial Rubric for Evaluation of Anth 490 Seminar Papers
100 points total
1. Statement of Problem (20% of total)
Clarity of statement
Understanding of the literature relevant to the problem
Use of the literature from seminar readings
Originality
2. Strength of Research (30% of total)
Literature search
Use of literature (understanding and evaluation)
Data collection (if relevant)
3. Integration of Theory and Data (20% of total)
Synthesis of data from different sources and/or use of original data
Use of theory to enhance data analysis
Use of data to evaluate theory
4. Presentation (30% of total)
Clarity of writing
Grammar and spelling
Organization of writing
Use of graphics and/or tables
Bibliographic formatting (Use American Anthropologist or American Antiquity format).
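As a purely illustrative sketch (the rubric itself does not prescribe how individual criteria are scored), the following Python snippet shows one way the four weighted categories could combine into the 100-point total, assuming each criterion is rated as a proportion between 0 and 1 and averaged within its category.

# Hypothetical scoring sketch; criterion-level ratings and averaging are assumptions, not part of the rubric.
WEIGHTS = {
    "Statement of Problem": 20,
    "Strength of Research": 30,
    "Integration of Theory and Data": 20,
    "Presentation": 30,
}

def rubric_total(criterion_ratings):
    # criterion_ratings maps each category to a list of 0-1 ratings, one per criterion.
    total = 0.0
    for category, points in WEIGHTS.items():
        ratings = criterion_ratings[category]
        total += points * (sum(ratings) / len(ratings))
    return round(total, 1)

# Example: a paper strong on research but weak on integrating theory and data.
example = {
    "Statement of Problem": [0.8, 0.7, 0.9, 0.6],
    "Strength of Research": [0.9, 0.8, 0.8],
    "Integration of Theory and Data": [0.5, 0.4, 0.5],
    "Presentation": [0.7, 0.8, 0.7, 0.9, 0.8],
}
print(rubric_total(example))  # 72.7 out of 100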