Central Washington University
Assessment of Student Learning, 2012 to 2013
Anthropology Department and Program Report

Please enter the appropriate information concerning your student learning assessment activities for this year.

Academic Year of Report: 2012 to 2013
College: Sciences
Department: Anthropology
Programs: B.S., B.A., Anthropology Minor, Museum Studies Minor

Table of Contents
1. What student learning outcomes were assessed this year, and why?
   Programmatic Goals
   General Education Related Goal
2. How were they assessed?
   Programmatic Goals
   A) What methods were used?
   B) Who was assessed?
   C) When was it assessed?
3. What was learned?
   Programmatic Goals
4. What will the department or program do as a result of that information?
   Programmatic Goals
5. What did the department or program do in response to last year's assessment information?
   Programmatic Goals
6. Questions or suggestions concerning Assessment of Student Learning at Central Washington University

Table of Tables
Table 1. 2011-2012 Assessed Student Learning Outcomes for Anthropology Minors and Majors.
Table 2. 2012-2013 Paired Student Results for Anthropology Program Content/Knowledge Exam.
Table 3. 2011-2012 Paired Student Results for Anthropology Program Content/Knowledge Exam.
1. What student learning outcomes were assessed this year, and why?

In answering this question, please identify the specific student learning outcomes you assessed this year and the reasons for assessing these outcomes, with the outcomes written in clear, measurable terms, and note how the outcomes are linked to department, college, and university mission and goals.

Programmatic Goals

The department assessed four Content/Knowledge student learning outcomes (LOs) from our Programmatic Goals and Assessment plan. We chose these four LOs to pilot our revised Anthropology Majors/Minors content/knowledge exam as an assessment tool for the content/knowledge LOs. The revised exam was adapted from the department's previous comprehensive exam after it was found that the old questions were not assessing the Anthropology Program's content/knowledge LOs well. The entire Anthropology Department faculty wrote new questions for the revised exam two years ago. Last year we gave the exam in our entry-to-the-major courses (Anth 301) and our major/minor capstone courses (Anth 458).

Table 1 shows which program, college, and university goals are related to the four Content/Knowledge student learning outcomes. Appendix A contains a matrix of the department's goals and how they are related to the college and university goals. These SLOs support the CWU core value of inclusiveness and the COTS mission to create and apply knowledge to solving human, social, and environmental problems.

Table 1. 2011-2012 Assessed Student Learning Outcomes for Anthropology Minors and Majors.*

Student learning outcomes: Anthropology students should be able to:

(Content/Knowledge, CK 1) Demonstrate understanding of concepts basic to the anthropological study of humans, such as culture, cultural variability, evolutionary change, ethnocentrism, cultural relativism, and holism.
   Related program goals: 1, 2, 3. Related college goal: 1. Related university goal: 6.
   Method of assessment: content/knowledge exam questions.
   Who/what assessed: majors/minors.
   When assessed: entry (Anth 301); exit (Anth 458/459).
   Criterion of achievement: 95% of students will pass the exam at a 75% minimum.

(Content/Knowledge, CK 2) Identify the goals, main subject areas, and distinctive characteristics of each subfield.
   Related program goals: 1, 2, 3. Related college goal: 1. Related university goal: 1.
   Method of assessment: content/knowledge exam questions.
   Who/what assessed: Anth 107, 110, 120, 130, 180.
   When assessed: entry (Anth 301); exit (Anth 458/459).
   Criterion of achievement: 95% of students will pass the exam at a 75% minimum.

(Content/Knowledge, CK 3) Demonstrate knowledge and understanding of the range and variety of human bio-cultural and environmental adaptations, past and present.
   Related program goals: 1, 2, 3. Related college goal: 7. Related university goals: 1, 6.
   Methods of assessment: content/knowledge exam questions; subfield course (3xx-4xx) exams and papers; coursework and presentations in Anth 458/459.
   Who/what assessed: majors/minors; B.S., B.A.
   When assessed: entry (Anth 301); exit (Anth 458/459); coursework in Anth 458/459.
   Criteria of achievement: 95% of students will pass the exam at a 75% minimum; portfolio – score @ _____ on rubric.

(Content/Knowledge, CK 4) Demonstrate knowledge and understanding of the symbolic and communicative capacities of humans expressed through culture.
   Related program goals: 1, 2, 3. Related college goal: 7. Related university goals: 1, 7.
   Methods of assessment: content/knowledge exam questions; linguistic, cultural, and archaeological subfield courses @ 3xx-4xx.
   Who/what assessed: B.S., B.A.
   When assessed: entry (Anth 301); exit (Anth 458/459).
   Criteria of achievement: 100% of students will meet 3xx and 4xx; 95% of students will pass the exam at a 75% minimum; portfolio – score @ _____ on rubric.
* See Appendix A for the goals of the Department/Program of Anthropology, the College of the Sciences, and Central Washington University.

General Education Related Goal

The department did not assess the General Education Related Goal (GERG) for Information Technology assigned by the AVP for Undergraduate Studies. No instructions or rubrics were provided by that office, as they were for previous GERG assessments.

2. How were they assessed?

In answering these questions, please concisely describe the specific methods used in assessing student learning. Please also specify the population assessed, when the assessment took place, and the standard of mastery (criterion) against which you will compare your assessment results. If appropriate, please list the survey or questionnaire response rate from the total population.

Programmatic Goals

A) What methods were used?

The anthropology content/knowledge exam was given to Anthropology majors/minors over the past year, as it has been for the previous two years. The exam is a multiple-choice test consisting of 60 questions; the number of questions varies across the content/knowledge learner outcomes. Our standard of mastery is that each Anthropology major/minor will pass the exam in Anth 458 by answering at least 75% of the questions correctly.

B) Who was assessed?

Two courses, Anth 301 and Anth 458, are required for all majors and minors in the Anthropology Program, and the exam was administered to all students taking these courses. By this year, 114 students had taken the exam in both courses since sampling began in Fall 2011. This allows paired results, which provide the highest-resolution view of an individual student's performance and perhaps an indication of the Anthropology Program's effectiveness for these specific LOs and that particular student.

C) When was it assessed?

The exam was administered during regular class time in Winter, Spring, and Fall Quarters 2011; Winter and Spring Quarters 2012; and, this year, Fall Quarter 2012 and Winter and Spring Quarters 2013.

Graduating Student Exit Questionnaire

Students graduating from the Anthropology Department complete an exit questionnaire. Their responses to some questions were considered in our assessment this year.

3. What was learned?

In answering this question, please report results in specific qualitative or quantitative terms, with the results linked to the outcomes you assessed and compared to the standard of mastery (criterion) you noted above. Please also include a concise interpretation or analysis of the results.

Programmatic Goals

Table 2 contains the number of students scoring below and at or above 75% on each content/knowledge LO for the 114 paired students. For all content/knowledge LOs, the proportion of students answering at least 75% of the questions correctly increased from Anth 301 to Anth 458. While some students are meeting our standard of mastery (at least 75% correct), others are not; as a group, the paired results do not meet the department's criterion that 95% of students pass. Table 3 shows the results of the paired comparison from 2011-2012. Compared with those earlier results, this year's students (Table 2) are performing better on the exams; this could be a result of the larger sample size. There are a number of variables that could be affecting the exam results.
How far apart a student's two exam sittings are, one year or more, could affect the change in scores, on the assumption that more time spent taking Anthropology classes means more absorption of the content/knowledge material; conversely, little time between the two sittings would limit the change in scores. As more paired results become available (we continue to give the test), we can track this effect directly. Another variable might be the questions themselves. With large samples, we could examine each question and determine the distribution of students answering it correctly; that kind of information could be used to evaluate the questions.

Table 2. 2012-2013 Paired Student Results for Anthropology Program Content/Knowledge Exam (n = 114 paired students).

                        Anth 301 exam results    Anth 458 exam results
                        Count       Percent      Count       Percent
CK1   Less than 75%       28          25           17          15
      75% or higher       86          75           97          85
CK2   Less than 75%       38          33           20          18
      75% or higher       76          67           94          82
CK3   Less than 75%       38          33           22          19
      75% or higher       73          67           92          81
CK4   Less than 75%       34          30           15          13
      75% or higher       80          70           99          87

Table 3. 2011-2012 Paired Student Results for Anthropology Program Content/Knowledge Exam.

                        Anth 301 exam results    Anth 458 exam results
                        Count       Percent      Count       Percent
CK1   Less than 75%       10          50            3          15
      75% or higher       10          50           17          85
CK2   Less than 75%       11          55            6          30
      75% or higher        9          45           14          70
CK3   Less than 75%        9          45            9          45
      75% or higher       11          55           11          55
CK4   Less than 75%       11          55            3          15
      75% or higher        9          45           17          85

The exam will be automated this year so that we can do an item-by-item analysis and assess the exam itself (an illustrative sketch of such an analysis appears at the end of this section).

Graduating Student Exit Questionnaire

The following data are from the Anthropology Graduating Student Exit Questionnaire. Graduating students were asked to rate the following items on a scale of 1-5 (1 = unsatisfied, 5 = extremely satisfied). The mean response appears after each item. Results are based on 27 responses unless otherwise indicated.

1.) Quality of Instruction in the Anthropology Department: 4.6
2.) Quality of Anthropology Facilities: 4.8
3.) Range of Courses: 4.1

For the next set of items, students were asked to rate how the following experiences taught them the value of anthropology as part of their preparation for life (1 = extremely unhelpful, 5 = extremely helpful).

4.) Experience as Teaching Assistant (applicable to 9 students): 4.1
5.) Experience with Faculty Mentor (applicable to 24 students): 4.5
6.) Individual Study Experience (applicable to 17 students): 4.5
7.) Access/Quality of Advising (applicable to 25 students): 4.2
8.) Experience of Earning Anthropology Degree at CWU: 4.5

The final rating item asked how the student's anthropology education and experience at CWU have contributed to preparation for life in the early 21st century ("Having examined other cultures, I can better understand my own"; 1 = disagree, 5 = strongly agree).

9.) Ability to Understand Own Culture: 4.6

The following are the numbers of students participating in various experiences, with the percentage of respondents in parentheses:

Study Abroad: 1 (4%)
Field School: 21 (78%)
Internship: 7 (26%)
Faculty-mentored presentation, poster, or publication: 12 (45%)
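To make the planned item-by-item analysis and paired comparison more concrete, the following is a minimal sketch of how the pass-rate summaries in Tables 2 and 3 could be computed once exam scores are available electronically. It is an illustration only: the data layout, student IDs, and scores are invented for the example and are not the department's actual export format.

```python
# Hypothetical sketch: summarize paired entry (Anth 301) and exit (Anth 458)
# exam scores against the department's 75% standard of mastery.
# The input format below is assumed for illustration, not an actual data export.

PASS_CUTOFF = 75.0  # percent correct required to meet the standard of mastery

# Each record: (student id, percent correct at entry, percent correct at exit)
paired_scores = [
    ("S001", 62.0, 80.0),
    ("S002", 78.0, 85.0),
    ("S003", 70.0, 72.0),
]

def pass_rates(pairs, cutoff=PASS_CUTOFF):
    """Return the share of students at or above the cutoff at entry and at exit."""
    n = len(pairs)
    entry_pass = sum(1 for _, entry, _ in pairs if entry >= cutoff)
    exit_pass = sum(1 for _, _, final in pairs if final >= cutoff)
    return entry_pass / n, exit_pass / n

entry_rate, exit_rate = pass_rates(paired_scores)
print(f"Entry (Anth 301): {entry_rate:.0%} of students at or above {PASS_CUTOFF:.0f}%")
print(f"Exit  (Anth 458): {exit_rate:.0%} of students at or above {PASS_CUTOFF:.0f}%")

# Because results are paired, change for each individual is also easy to flag.
for sid, entry, final in paired_scores:
    print(sid, "improved" if final > entry else "did not improve")
```

Once real exports are available, the same calculation could be run separately on the questions coded to each content/knowledge LO to reproduce the CK1-CK4 breakdowns shown in Tables 2 and 3.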
4. What will the department or program do as a result of that information?

In answering this question, please note specific changes to your program as they affect student learning, and as they are related to results from the assessment process. If no changes are planned, please describe why no changes are needed. In addition, how will the department report the results and changes to internal and external constituents (e.g., advisory groups, newsletters, forums, etc.)?

Programmatic Goals

The content/knowledge exam will be automated on a Blackboard website. The questions will be coded by content/knowledge LOs. This will allow us to automate the comparison of students' entry and exit results and to generate distribution data on the answers students are choosing. That information will be used in our next assessment cycle to further calibrate our exam. We do not yet think that these exam results can be used reliably to make changes to the program curriculum. This year we are participating in a Computer Science senior project that will allow us to automate our exam; please see the attached proposal (Appendix B) for the project description.

We intend to assess student responses to ethics issues in the next year.

5. What did the department or program do in response to last year's assessment information?

In answering this question, please describe any changes that have been made to improve student learning based on previous assessment results. Please also discuss any changes you have made to your assessment plan or assessment methods.

Programmatic Goals

We continue to refine the program's assessment plan to provide better context on whether our students are achieving the LOs. Additional modes of assessment, such as class scores and portfolio assessment, will help us further identify areas for attention. We had hoped to move into assessing Information Technology and Literacy this academic year but have yet to receive any guidance for doing so. See the attached "2011-2012 Assessment of Student Learning Report/Feedback for the Department of Anthropology, Degree Award BS & BA & BA (small plan)" (Appendix C).

6. Questions or suggestions concerning Assessment of Student Learning at Central Washington University:

Some suggestions:

More lead time on the assessment targets: Information Technology and Literacy is to be assessed this year, and we have received no guidance yet.

Workload units for the department's assessment committee if it is to continue collecting data at these rates.

Appendix A: Goals for the Department/Program of Anthropology, College of the Sciences, and Central Washington University

Anthropology Goals
1. Provide opportunities for students to develop a holistic understanding of human issues through the theories and methods of cultural, linguistic, archaeological, and biological anthropology.
2. Foster student learning in small-scale and individualized settings with respect for different points of view.
3. Provide opportunities for student-faculty interaction in research, field work, and teaching assistant experience.
4. Build interdisciplinary relationships that express the multidisciplinary scope of anthropology.
5. Engage in research and scholarly activities and share results with peers, students, and the public.
6. Offer public services that link research and teaching with the needs and interests of local, state, and regional communities.
7. Raise awareness of ethical conduct in cross-cultural and intercultural interactions for program, campus, and community.
8. Recruit and retain highly qualified faculty to maintain and develop program currency, stability, and effectiveness in all four sub-fields.
9. Students will demonstrate knowledge, skills, and values relevant to student learning outcomes.

College of the Sciences Goals
1. Provide for an outstanding academic and student experience in the College of the Sciences.
2. Provide for an outstanding academic and student life in college programs and courses at the university centers.
3. Provide for outstanding graduate programs that meet focused regional needs and achieve academic excellence.
4. Develop a diversified funding base to support curriculum and academic facilities, student and faculty research and scholarships, as well as faculty development, service, and applied research in college disciplines.
5. Build partnerships that support academic program quality and student experiences in the College of the Sciences, including those with private, professional, academic, government, and community-based organizations.
6. Strengthen the college's contributions to the field of education.

University Goals
1. Maintain and strengthen an outstanding academic and student life on the Ellensburg campus.
2. Provide for an outstanding academic and student life at the university centers.
3. Strengthen and further diversify our funding base and strengthen infrastructure to support academic and student programs.
4. Build mutually beneficial partnerships with the public sector, industry, professional groups, institutions, and the communities surrounding our campuses.
5. Achieve regional and national prominence for the university.
6. Build inclusive and diverse campus communities that promote intellectual inquiry and encourage civility, mutual respect, and cooperation.
7. Create and sustain productive, civil, and pleasant learning environments.
Appendix B: Computer Science Senior Project Proposal

Contact Information
Name: ??
Department/Affiliation: ??????
Email:
Phone Number: ?? ????

Description of Project

The Anthropology Department has a comprehensive assessment exam that is given to its majors twice: when they enter the program and at the end of the program. The questions on the exam are coded both for content knowledge by anthropology subdiscipline and for specific student learning outcomes tied to the department's assessment plan, so each exam can be scored in two ways: by subdiscipline content and by learner outcome. The problem is that we cannot easily calculate the scores both ways, and the question-by-question analysis needed to determine what is driving the exam scores is prohibitively time-consuming. A software solution for our test would greatly facilitate our assessment efforts and provide readily accessible feedback to the department. Working closely with Blackboard staff last year, we determined that its software/architecture cannot do what we need, so for now we are stuck hand-calculating the results twice for each of our majors and have not been able to do a question-by-question analysis.

A software program that outputs two scores, one by subdiscipline and one by learner outcome, would be useful; if it could also show us how students answer each question, the department would gain a strategic advantage over similar peer institutions in curriculum development, assessment, and feedback loops. Part of the software output should be graphical and numerical summaries of the results by student pairs (e.g., entry to major and exit from major), as these could readily show differences in performance between individuals, groups, and cohorts and provide an analytical means to evaluate our program.
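As a rough illustration of the two-way scoring and question-level analysis described above, the sketch below scores a single exam by subdiscipline and by learner outcome and computes the percent of students answering each question correctly. The question codes, answer key, and responses are invented for the example; the real exam key and data format would come from the department.

```python
# Hypothetical sketch of the two-way scoring described in the proposal:
# each question is coded with a subdiscipline and a learner outcome (LO),
# so one set of answers yields both a subdiscipline score and an LO score.
# Question codes and responses below are invented for illustration.
from collections import defaultdict

# question id -> (subdiscipline, learner outcome, correct answer)
answer_key = {
    "Q1": ("cultural", "CK1", "B"),
    "Q2": ("archaeology", "CK3", "D"),
    "Q3": ("linguistics", "CK4", "A"),
    "Q4": ("biological", "CK3", "C"),
}

def score_exam(responses, key=answer_key):
    """Return percent correct grouped by subdiscipline and by learner outcome."""
    by_sub, by_lo = defaultdict(list), defaultdict(list)
    for qid, (sub, lo, correct) in key.items():
        is_correct = responses.get(qid) == correct
        by_sub[sub].append(is_correct)
        by_lo[lo].append(is_correct)
    pct = lambda flags: 100.0 * sum(flags) / len(flags)
    return ({s: pct(v) for s, v in by_sub.items()},
            {l: pct(v) for l, v in by_lo.items()})

def item_analysis(all_responses, key=answer_key):
    """Percent of students answering each question correctly (item difficulty)."""
    n = len(all_responses)
    return {qid: 100.0 * sum(r.get(qid) == correct for r in all_responses) / n
            for qid, (_, _, correct) in key.items()}

student = {"Q1": "B", "Q2": "A", "Q3": "A", "Q4": "C"}
print(score_exam(student))                    # scores by subdiscipline and by LO
print(item_analysis([student, {"Q1": "B"}]))  # per-question percent correct
```

Run separately on the entry (Anth 301) and exit (Anth 458) sittings, the same per-question percentages would supply the paired, question-level feedback the proposal asks for.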
Questionnaire

1. Purpose. Does software exist that solves this or a similar problem?
   ☐ Yes   ☒ No   ☐ Not Sure
   If "Yes", please list what the software is (this can help in understanding the project or serve as a starting point). If possible, explain why existing software is not appropriate for your use.
   Blackboard's testing programs give you student scores for the test, but not how the students did on particular questions or at different sittings of the test.

2. Scope. Is the project large enough to keep a team of 4 students busy for 4-8 hours/week/person for two terms?
   ☐ Definitely   ☐ Probably   ☒ Not Sure   ☐ No

3. Timeline. Does the project need to be completed before the beginning of March (i.e., the end of Winter term)?
   ☐ Yes   ☒ No

4. Level of Risk. Student projects can fail. How important is having a fully functional piece of software completed?
   ☐ Very important   ☐ Somewhat important   ☐ Not important   ☒ No worse off without it

Please email this document to Dr. John Anvik (janvik@cwu.edu).

Appendix C: Central Washington University 2011-2012 Assessment of Student Learning Report Feedback

Department: Anthropology
Degree Award: BS & B.A.
Program: Anthropology

1. What student learning outcomes were assessed this year, and why?

Guidelines for Assessing a Program's Reporting of Student Learning Outcomes (Target = 2). Program score: 2.
   Value 4: Outcomes are written in clear, measurable terms and include knowledge, skills, and attitudes. All outcomes are linked to department, college, and university mission and goals.
   Value 3: Outcomes are written in clear, measurable terms and include knowledge, skills, and attitudes. Some outcomes are linked to department, college, and university mission and goals.
   Value 2: Outcomes are written in clear, measurable terms and include knowledge, skills, or attitudes. Outcomes may be linked to department, college, and university mission and goals.
   Value 1: Some outcomes may be written as general, broad, or abstract statements. Outcomes include knowledge, skills, or attitudes. Outcomes may be linked to department, college, and university mission and goals.
   Value 0: Outcomes are not identified.
Comments: No attitudes assessed.

2. How were they assessed? a. What methods were used? b. Who was assessed? c. When was it assessed?

Guidelines for Assessing a Program's Reporting of Assessment Methods (Target = 2). Program score: 3.
   Value 4: A variety of methods, both direct and indirect, are used for assessing each outcome. Reporting of assessment method includes population assessed, number assessed, and, when applicable, survey response rate. Each method has a clear standard of mastery (criterion) against which results will be assessed.
   Value 3: Some outcomes may be assessed using a single method, which may be either direct or indirect. All assessment methods are described in terms of population assessed, number assessed, and, when applicable, survey response rate. Each method has a clear standard of mastery (criterion) against which results will be assessed.
   Value 2: Some outcomes may be assessed using a single method, which may be either direct or indirect. All assessment methods are described in terms of population assessed, number assessed, and, when applicable, survey response rate. Some methods may have a clear standard of mastery (criterion) against which results will be assessed.
   Value 1: Each outcome is assessed using a single method, which may be either direct or indirect. Some assessment methods may be described in terms of population assessed, number assessed, and, when applicable, survey response rate. Some methods may have a clear standard of mastery (criterion) against which results will be assessed.
   Value 0: Assessment methods are nonexistent, not reported, or include grades, student/faculty ratios, program evaluations, or other "non-measures" of actual student performance or satisfaction.
Comments: No indirect assessment methods apparent.

3. What was learned (assessment results)?

Guidelines for Assessing a Program's Reporting of Assessment Results (Target = 2). Program score: 4.
   Value 4: Results are presented in specific quantitative and/or qualitative terms. Results are explicitly linked to outcomes and compared to the established standard of mastery. Reporting of results includes interpretation and conclusions about the results.
   Value 3: Results are presented in specific quantitative and/or qualitative terms and are explicitly linked to outcomes and compared to the established standard of mastery.
   Value 2: Results are presented in specific quantitative and/or qualitative terms, although they may not all be explicitly linked to outcomes and compared to the established standard of mastery.
   Value 1: Results are presented in general statements.
   Value 0: Results are not reported.
Comments:

4. What will the department or program do as a result of that information (feedback/program improvement)?

Guidelines for Assessing a Program's Reporting of Planned Program Improvements (Target = 2). Program score: 2.
   Value 2: Program improvement is related to pedagogical or curricular decisions described in specific terms congruent with assessment results. The department reports the results and changes to internal and/or external constituents.
   Value 1: Program improvement is related to pedagogical or curricular decisions described only in global or ambiguous terms, or plans for improvement do not match assessment results. The department may report the results and changes to internal or external constituents.
   Value NA: Program improvement is not indicated by assessment results.
   Value 0: Program improvement is not addressed.
Comments:

5. How did the department or program make use of the feedback from last year's assessment?

Guidelines for Assessing a Program's Reporting of Previous Feedback (Target = 2). Program score: 2.
   Value 2: Discussion of feedback indicates that assessment results and feedback from previous assessment reports are being used for long-term curricular and pedagogical decisions.
   Value 1: Discussion of feedback indicates that assessment results and feedback from previous assessment reports are acknowledged.
   Value NA: This is a first year report.
   Value 0: There is no discussion of assessment results or feedback from previous assessment reports.
Comments: