Division of Academic Affairs
Annual Assessment Report
For Undergraduate & Graduate Degree Programs
AY 2008-2009
Name of Program: Political Science
College: COAS
Prepared by: Staci Beavers, PSCI Chair & Dept. Contact; Shana Bass, Assessment Lead and Report's Author
Date: 5/20/09
Email Address: sbeavers@csusm.edu; sbass@csusm.edu
Extension: Beavers: x4194; Bass: on leave 2009-10, not available via campus phone
PART A (Required by May 21, 2009 – last day of Spring semester)
1) Please describe the student learning outcomes you focused on for assessment this year, the assessment activities you used to measure student learning in these areas, and the results of your assessments. Please also comment on the significance of your results.
SLO: 6) Demonstrate working knowledge of research methods by applying said methods to
critically analyze political phenomena.
For a description of the assessment project, please see the narrative by Dr. Bass, included in this document beginning on p. 3.
2) How did your program utilize any resources provided for assessment this year? Please attach a budget with specifics.

The budget provided to the Political Science Department was used for photocopying the final survey and codebook ($260.52) and for supplies for survey administration, including nametags for student survey administrators, clipboards for respondents, and pens ($91.97, plus a further $31.06, as itemized below). Professor Bass also earned a $500 stipend from this budget. A total of $883.55 was spent on our 2008-09 assessment.
Stipend for Professor Shana Bass: $500.00
Photocopying of Survey and Codebook: $260.52 (PSCI)
Clipboards, Nametags, Pens: $91.97 (Groom); $31.06 (Bass)
TOTAL SPENT: $883.55
Page 1 of 8
5/24/08
Annual Report on Assessment of Degree Programs, AY 2008-09
3) As a result of your assessment findings, what changes at either the course or program level are being made and/or proposed in order to improve student learning? Please articulate how your assessment findings suggest the need for any proposed changes.

In the future, and perhaps during our PEP in 2009-10, the department should consider further curriculum changes (course-level changes to PSCI 301 and program-level changes to expand course offerings in research methodology) that would provide more extensive opportunities for students to learn, practice, and work with these key concepts (and others) in order to develop a solid foundation in the fundamentals of research design and methodology. Further, expanding our course offerings would not only give students more exposure to the fundamentals of research design, but would also provide opportunities for students to learn about and practice a range of research methodologies. Moreover, if students develop a solid foundation in research design and methods, they will be better prepared for the general upper-division curriculum in Political Science and for the capstone Senior Research Seminar.
Possible proposals for discussion:
CONTINUE WITH ONE METHODS COURSE: If PSCI 301 remains our
only course, we should strongly consider a greater focus on research
fundamentals (similar to the structure of this course) and application of
fundamentals in the context of one main, professor-guided, collaborative
research project (similar to the Exit Poll Project). This may require us to
sacrifice creating opportunities for students to become proficient in a breadth
of research methodologies, but would allow students to build an important
foundation for PSCI upper division courses and Senior Seminar. We could
further utilize the Senior Seminar as an opportunity for further instruction on
the breadth of research methods in the field.
TWO-COURSE SEQUENCE: For the proposed two-course sequence, the department may consider adding a lower-division Introduction to Political Science course that would cover basic social/political science thinking and the fundamentals of research, or creating a two-course upper-division research methods sequence (e.g., PSCI 301A and PSCI 301B). The advantage of creating a lower-division Introduction to Political Science course would be an additional opportunity to recruit majors. The first of these courses (upper or lower division) could focus on the fundamentals of political science research, including assignments and exercises on developing research questions, hypotheses, and variables (similar to the curriculum in PSCI 301 in Fall 08), how to conduct literature reviews and write annotated bibliographies, a class-wide collaborative project creating and analyzing an original dataset, and an overview of the different areas of the field of political science, with guest lectures/discussions by invited PSCI faculty on their own
research projects. The second course could be a more straightforward
research methods course where students learn about and apply the various
research methods in the field of political science, focus on formulating
original research designs utilizing a variety of qualitative and quantitative
research methods, and/or a semester-long original individual or group research
project applying research fundamentals and research methodology.
PART B: Planning for Assessment in 2009-2010 (Required by May 21, 2009)

4) Please identify one or two student learning outcomes that your program will focus on for assessment next year.
The Department will undergo a PEP review in 2009-2010. We have been informed by OAP
that a separate assessment project will not be required of us next year for that reason.
5) What specific assessment activities will you conduct next year in order to measure program student learning in these areas?

See above.
6) What new or additional resources/support might your program need in order to conduct these assessment activities next year? (Please provide specific information regarding your needs and related costs.)
We are not submitting a request for additional funding, though we anticipate incurring
the "normal" PEP expenses next year.
OVERALL DESCRIPTION OF PSCI DEPARTMENT ASSESSMENT ACTIVITIES
PSCI 301, our research methods course, is currently rotated among tenure-track faculty
Shana Bass, Scott Greenwood, and Steve Nichols. As the instructor for the Fall 2008 section of
PSCI 301, Dr. Shana Bass shouldered the responsibility for this project. Based on last year’s
assessment of PSCI 301, where we determined that students were still struggling with the basic
concepts of creating, testing, and understanding relationships between research questions,
hypotheses, and variables, we decided to alter our Fall 2008 PSCI 301 curriculum to allow more
class time, activities, and assignments that focused on solidifying these concepts. Moreover, we
focused the examples and assignments in the course on the theme of the 2008 election and
scholarship on campaigns and elections in general. Though PSCI majors have a range of
interests in many fields of Political Science, the focus on the 2008 election as it was occurring
allowed students to become immersed in the research methods of one particular field in political
science and also to understand more clearly how research hypotheses are tested and original
research is conducted in real time. There were 34 students enrolled in PSCI 301 in Fall 2008.
Three significant activities comprised this year’s assessment project: Curriculum Changes, Exit
Poll Election 2008, and Post-Test/Application. A brief overview, including learning objectives
and the assessment rubric utilized across all steps of the semester’s assessment, is included in
Appendix A.
Activity 1: Curriculum Changes/Examination - Description: First, based on last year’s
assessment findings that students need more practice identifying variables and understanding the
connections between variables, research questions, and hypotheses, Professor Bass made significant changes to the curriculum introducing research questions, hypotheses, and variables. The new curriculum was a slow, methodical, step-wise introduction to each of these concepts, with each step including: 1) an initial lecture and explanation of the concept (e.g., research question formulation), including many examples of different research questions; 2) an in-class participation exercise asking students to come up with possible research questions on particular topics; 3) a take-home assignment to develop three research questions on particular topics discussed in class and in the readings; 4) a next-day in-class assignment in which students volunteered sample research questions from their homework and the professor and the class worked together to analyze, understand, and improve each question; and 5) a follow-up lecture, discussion, and examples on the next concept (e.g., hypothesis formulation), with a similar assignment to revise the original homework research questions and then create three sample hypotheses from them. This cycle continued through hypotheses and variables.
Activity 1: Curriculum Changes/Examination - Results: This combination of professor-led
learning (lecture, examples, discussion), student homework, and student critique of homework
actively engaged the students and allowed students the opportunity to develop and hone research
questions, hypotheses, and variables by revising and then building on previous knowledge and
assignments. Further, the assignment of particular research topics for questions based on class
discussions of issues surrounding elections (and related to class readings and discussions on
these topics, e.g. Midterm Elections, Female Candidates, Voter Turnout, Voter Information
Levels, etc.) helped students to focus their practice and develop a research train of thought
around one or two particular straightforward topics.
The first examination of these concepts took place in early October - after the curriculum
changes but before the Survey Research/Exit Poll project.
On the first test of these concepts (n=34 students), students' overall scores on the assessment rubric were high: the mean score was 31.80 out of a possible 36 points (12 items were scored against a 3-point rubric; data are included in Appendix A). Students were asked to define
key concepts, identify key concepts in an example, and then to write their own research
questions, hypotheses, and variables and discuss the relationship.
Students scored high when asked to simply define hypotheses (2.85/3.00) and
independent (2.88/3.00) and dependent variables (2.85/3.00). Only one student out of 34
supplied a failing answer for the definition of independent variables, and only one student
supplied a failing answer for the definition of hypotheses.
Students also performed well when asked to identify the research question (2.85/3.00),
hypothesis (2.79/3.00), and independent (2.59/3.00) and dependent variables (2.59/3.00) in an
example. While students were less able to identify the variables correctly, only 6 out of 34
students reported failing answers for both of the variables.
When asked to write their own research questions and hypotheses, the vast majority of
students again did well (2.62/3.00 for both). However, when students were required to identify
the independent and dependent variables in their made-up research project, students faltered a
bit, with an average of 2.38/3.00 on the dependent variables and 2.03/3.00 on the independent
variables, showing that students are still struggling with these concepts. However, when asked
to describe findings that would answer their research questions and show strength for their
hypothesis, students were competent again, with an average score of 2.61/3.00, showing that
students are beginning to understand the relationships between variables and the fundamentals of
research design.
Overall on the first examination, students performed well on the assessments of research
questions and hypotheses, and were able to understand and identify relationships, but still
struggled a bit with variables. (See Appendix A for data.)
Activity 2: Exit Poll Election 2008 Project Final Paper- Description: The second major activity
of our assessment project this year was the formulation, execution, and analysis of a class-wide
Exit Poll for Election 2008. The project entailed students’ creating and testing hypotheses about
presidential and congressional vote choice. During and outside of class time, with close
mentoring from Professor Bass, 9 groups of students each developed their own hypothesis to test
based on the 2008 presidential election. Each group worked with the dependent variable of
presidential vote choice or congressional vote choice, identified their independent variables (e.g.
gender, party identification, level of political knowledge, ideology, age, image of presidential
candidate, etc.), and then wrote a small set of survey questions (up to 5 questions) to test their
hypotheses. Each hypothesis, identification of variables, and set of survey questions went
through several reviews by the professor as well as several active peer reviews in class. In class,
students were trained in survey research techniques, survey administration, coding and inputting
of survey data, and analysis of survey data using Microsoft Excel. Students practiced
approaching possible respondents in class and took a pilot survey. Copies of the class-wide
survey and the codebook are included in this report (See Appendix B: Exit Poll Survey and
Codebook).
All students’ survey questions were included in ONE class-wide survey, which allowed
students to collect a much larger amount of data (N=886, including 298 CSUSM students) than
would have been possible had students collected data as 9 individual groups. Further, students
would be able to share dependent variable questions (e.g. presidential vote choice) as well as a
few important independent variables (e.g. gender, age, party identification). Each student
administered approximately 25 surveys (some students did more), and each student coded the
surveys they conducted and recorded the data in an Excel spreadsheet. Each group then
compiled the data of all their surveys into one spreadsheet, sent it to the professor on deadline,
and the professor compiled the entire dataset and distributed it to the class for analysis.
Changes/errors were reported to the professor and announced to all students in a daily error
report.
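The compilation workflow described above — each group coding its own surveys, merging them into one class-wide dataset, and then running a simple bivariate analysis against a hypothesis — can be sketched in a few lines. This is only an illustrative sketch in Python rather than the Excel workflow the class actually used, and the variable names and numeric codes here are hypothetical stand-ins for the real codebook:

```python
from collections import Counter

# Hypothetical coded exit-poll records: each dict is one respondent,
# with numeric codes as a survey codebook might define them
# (e.g., vote: 1 or 2 for the two major candidates; gender: 1 or 2).
group_a = [{"vote": 1, "gender": 1}, {"vote": 2, "gender": 2}]
group_b = [{"vote": 1, "gender": 2}, {"vote": 1, "gender": 1}]

# Compile the groups' individual spreadsheets into one class-wide dataset,
# as the professor did before redistributing the data for analysis.
dataset = group_a + group_b

# Cross-tabulate vote choice by gender: the kind of simple bivariate
# test each group ran against its own hypothesis.
crosstab = Counter((r["gender"], r["vote"]) for r in dataset)
print(crosstab[(1, 1)])  # respondents coded gender=1 who voted 1 → 2
```

The same pattern extends to any of the shared independent variables (age, party identification, and so on), since all groups drew on the one compiled dataset.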
Activity 2: Exit Poll Election 2008 Project Final Paper– Results: The 34 students in PSCI 301
were divided into 9 working groups of 3-4 students for the exit poll project (N=9). Groups
worked together to develop research questions, hypotheses and variables, write a research
design, write a set of survey questions, administer the survey, code, input, and analyze the survey
data, prepare an oral presentation on their findings and discussion, and write a final paper
detailing their entire project, including an Introduction, Literature Review, Research Design,
Results, Discussion, Conclusion, and tables/graphs as appropriate. Copies of the data and the
assessment rubric are included in this report (Appendix A). These final projects were assessed
on the basis of their research question, hypothesis, dependent variable, independent variable,
evidence, survey questions, description of sample, description of results, external validity,
critique, and directions for future research.
The overall assessment scores for the final report on the Exit Poll project were high –
with an average of 34.33 out of a possible 36.00 points. The research questions, hypotheses, and variables for all groups were at the proficient/superior level (3.00), except for one group whose research question was changed by the students in the final project write-up. In addition, all groups' detailed descriptions of the evidence needed to test their hypotheses, survey questions, and descriptions of the sample were at the proficient/superior level, except for one group that showed weakness in the description of evidence and another that showed weakness in the description of the sample. Seven of the nine groups gave a proficient/superior description of their survey results, while two groups gave only an adequate description. Further, 7/9 of the groups showed a proficient/superior discussion/analysis of the results, while 2/9 showed an adequate discussion/analysis (average 2.77/3.00). Assessments of the
external validity of the survey were also strong, with 6/9 groups scoring in the highest category
and 3/9 of groups scoring in the middle category (average 2.67/3.00). Finally, except for 2
groups, all scored proficient/superior in discussion of directions for future research (average
2.67/3.00), and all groups save one scored proficient/superior in their critique of their project
(average 2.89/3.00). Moreover, all groups wrote reports that were proficient/superior on the whole, with an average overall group score on the rubric of 34.33 out of a possible 36.00. The highest score was 36.00 (achieved by 2/3 of the groups) and the lowest was 30.00/36.00 (2 of the 9 groups). Though a few groups showed weakness in different areas, overall the students'
execution of the survey research project was excellent, showing that students had solidified their
knowledge of survey research design and execution. (Please see Appendix A: Exit Poll
Assessment Data)
Activity 3: Post-Test/Application/Research Design – Description: On the final
examination in the class, students were asked to write an original survey research design
that a professional researcher/political scientist could execute. A similar sample question
was included in the study guide. The question on the exam was:
“You are a scholar interested in studying the Presidential Election of 2008. Develop an
ideal research project attempting to explain the votes of STUDENTS across California
for either President, Congress, or a certain proposition in the Presidential Election of
2008. Please write, in essay form, a specific research design for the project above. This
research design must include:
1) your specific research question(s)
2) the hypothesis(es) you will test
3) explanations for your hypotheses
4) specify your dependent variable and independent variable(s)
5) identify your methodology
6) specify your sampling method
7) discuss what kinds of specific findings would confirm your hypothesis(es) and what findings would disconfirm your hypothesis(es).
You may design your project to be as general or specific as you’d like – for example, you
can look at students at different types of schools, compare particular groups of students,
etc. Keep your research question clear and your design simple and straightforward.
Keep in mind that for your class project you needed to do what was feasible for a student project, but for this design, consider that you are a professional researcher and have all the necessary resources to conduct a significant and viable research study.”
Students were evaluated by the same 3-level assessment rubric on the basis of their
responses to each of the 7 requirements above.
This type of question gave students the opportunity to design a survey research project as if they had all the necessary resources at their disposal. Thirty-three students took the final examination.
Activity 3: Post-Test/Application/Research Design – Results: Overall, students scored very
high on the assessment of the research design, with an average score of 22.34 out of 24.00 points
possible on the assessment rubric. On the research questions, hypotheses and explanations,
students scored very high with averages of 2.84/3.00, 2.81/3.00 and 2.90/3.00, respectively.
Students also did well identifying dependent (2.78/3.00) and independent variables (2.75/3.00)
and offered excellent explanations of research methodology (2.94/3.00). While still certainly
more than adequate, discussion of sampling methodology was the weakest part of the students’
responses at 2.60/3.00. However, discussions of the findings that would show the strength of
their hypotheses were strong (2.72/3.00), showing that students were able to make connections
between their hypotheses and variables and understood that they were testing the relationship
between variables. While students could have a better understanding of variables, sampling
methodology, and findings, the vast majority of students showed a good understanding of the
fundamental concepts and research design, and an ability to apply these concepts to the creation
of an original research project, a key element of our Student Learning Objective 6. (Please see
Appendix D: Post-Test/Application Assessment Data)
Significance of Overall Results: First, a semester-long focus on fundamental building blocks of
research and research design around a common theme in the field provided students the
opportunity to work with these concepts, to put them into practice, and to apply what they
learned to new projects. Second, while students still have some areas of weakness (variables,
sampling methodology), it appears that these opportunities to practice, actively participate in
research, and apply what they learned all provided students opportunities to develop these
fundamental skills. Third, while it is certainly important that Political Science students be exposed to the full range of research methodology, it is clear that focusing on the building blocks of research lays an important foundation for research design and application.
In the context of our two full years of assessment data on PSCI 301, it is apparent that
time, practice, and application of fundamental research concepts provides students with
opportunities to improve their skills conducting research in political science. Students improved
their understanding of the connections between data and hypotheses this year, and while variables remain an area of weakness, students' understanding of even these concepts appears to have improved over last year's data.