Annual Assessment Report to the College 2010-11

College: ___Science & Mathematics_________
Department: ___Geological Sciences________
Program: ____B.S. and M.S.____________
Note: Please submit your report to (1) the director of academic assessment, (2) your department chair or program coordinator and (3) the
Associate Dean of your College by September 30, 2011. You may submit a separate report for each program which conducted assessment
activities.
Liaison: _______Matthew d’Alessio________
1. Overview of Annual Assessment Project(s)
1a. Assessment Process Overview: Provide a brief overview of the intended plan to assess the program this year. Is assessment under the
oversight of one person or a committee?
Our goal for the year’s assessment was to build a new assessment plan in light of changes in our department. We have hired four new tenure-track faculty since 2009 (a third of our department), including two who started in Fall 2011, so we hoped to rethink the department’s vision.
Our assessment is under the oversight of one person. Our new liaison started his role in Fall 2011.
1b. Implementation and Modifications: Did the actual assessment process deviate from what was intended? If so, please describe any
modification to your assessment process and why it occurred.
During 2010-2011, we did not meet our goal of designing a new assessment plan because we lacked a critical mass of active faculty who could devote intensive time to the task. During this year, faculty numbered 11, including one FERP faculty member. Two faculty searches were conducted in Fall 2010, which occupied a large percentage of faculty time, and three faculty were on FERP or sabbatical in Spring 2011. Furthermore, longitudinal assessment of the program was impractical: due to personnel changes, part-time faculty have been teaching our two "gateway" courses for the past two years, and they could not be asked to conduct assessment. Concurrently, curriculum changes that went into effect in Fall 2010 deleted the capstone sequence except for Honors students (a change that was itself based on assessment), and we failed to adequately design and assess an alternate capstone experience.
2. Student Learning Outcome Assessment Project: Answer questions according to the individual SLOs assessed this year. If you assessed more
than one SLO, please duplicate this chart for each one as needed.
2a. Which Student Learning Outcome was assessed this year?
Undergraduate Program: • Students can demonstrate conceptual understanding of different earth materials and the processes that
shape them throughout their history.
2b. What assessment instrument(s) were used to gather evidence about this SLO?
Pilot use of the Geoscience Concept Inventory (Libarkin & Anderson, 2005). This nationally recognized assessment in our field has been
rigorously validated and used throughout the country. The GCI tests students’ ability to synthesize their background knowledge to answer
questions about novel situations. It assesses the “understanding” and “applying” levels of Bloom’s taxonomy that are typically covered in
introductory survey courses. The GCI is multiple choice, and scoring is automated in Moodle. An example item asks, “If you could travel millions of years into the future, how big would the planet Earth be?” This item tests students’ conceptual understanding of plate tectonics, a fundamental Earth process. Because such a question is unlikely to have been posed to students in a textbook, it probes above the base level of “remembering” in Bloom’s taxonomy.
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of
students, which courses, how decisions were made to include certain participants.
All students in GEOL 106LRS (about 76 students) were asked to complete the GCI outside of class twice during the semester (pre/post). The pre-test was characterized as a survey of background knowledge, and the post-test was presented to students as required review for the final exam.
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was
a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
In the pilot study, the SLO was measured longitudinally during a single geoscience course in order to “tune” the assessment. Because of the breadth of the field of geoscience, the GCI offers a menu of questions of similar difficulty that cover slightly different topic areas. The test administrator can select combinations of questions appropriate for the content of the course or program being assessed. It is essential to choose questions that align with the program learning objectives, not topics that are outside the scope of our program.
In future years, we plan to implement the GCI across the program in all introductory classes, with a post-test in classes typically taken in the sophomore or junior year. We expect that >75% of our B.S. majors will be able to score >75% on the GCI by this time. We will not administer the GCI in higher-level courses; assessments there will instead focus on the higher-order skills from Bloom’s taxonomy that students develop in their upper-division coursework.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the evidence was analyzed and highlight important findings from the
collected evidence.
In our pilot study, we assessed learning only at the course level. We calculated each student’s normalized gain:
(Post − Pre) / (100% − Pre).
This is a measure of how much students improved compared to how much they COULD have improved, given their pre-test score. It reduces the “ceiling” effects common in gain analyses. (See Figure 1 at the bottom of the document.)
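As a concrete illustration, the calculation is easy to script. The sketch below (Python; the helper name and the score data are hypothetical, not our actual Moodle export) computes normalized gains for a set of pre/post score pairs:

    def normalized_gain(pre: float, post: float) -> float:
        # Improvement relative to the maximum possible improvement,
        # given the pre-test score (scores expressed as percentages).
        if pre >= 100.0:
            return 0.0  # perfect pre-test: no room left to improve
        return (post - pre) / (100.0 - pre)

    # Hypothetical pre/post GCI percentages for three students
    scores = [(40.0, 55.0), (60.0, 70.0), (25.0, 55.0)]
    gains = [normalized_gain(pre, post) for pre, post in scores]
    print(f"mean normalized gain: {sum(gains) / len(gains):.0%}")

For example, a student who moves from 40% to 55% gains 15 of a possible 60 points, a normalized gain of 25%.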
• Two sections of the same course taught by the same instructor had significantly different results: 14% vs. 27% mean normalized gain. Students in the two sections had statistically identical pre-test scores.
• The course was implemented with team-based learning, and team sizes differed between the two sections (an average of 4.5 vs. 5.5 students). The section that showed greater improvement had more students per team; it is possible that smaller teams did not reach a critical mass.
• The morning section performed poorly. Time of day is a known factor in course performance (note that the assessment was administered outside of class for both sections).
• Plots of normalized gain versus pre-test score reveal that students with lower background knowledge at the beginning of the semester show much greater normalized gains than students with stronger background knowledge. This indicates that the course is targeted toward students with lower background knowledge. While that may be appropriate for an introductory GE class, more advanced students are not learning very much from the course. If we continue to see the same trends in our longitudinal program assessments, we will need to develop more enriching activities for more advanced students.
Students scored uniformly low in certain content areas of the GCI. These topics were not well addressed during the introductory survey course.
Either the course needs revision in these content areas or the program must ensure that such content is required in future courses.
After only an introductory course, few students score above 70% (See Figure 2 at the bottom of the document). Our goal is for most students to
score at least 75% on the assessment, consistent with the national average of geoscience graduates at peer institutions.
2f. Use of Assessment Results of this SLO: Think about all the different ways the resulting evidence was or will be used to improve academic
quality. For example, to recommend changes to course content/topics covered, course sequence, addition/deletion of courses in program,
student support services, revisions to program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc.
Please provide a clear and detailed description of how the assessment results were or will be used.
Our analysis of the pilot study shows that team size may affect student outcomes in the introductory course. We plan additional analysis to ensure that the general trend for the class holds up on a team-by-team basis. If it does, we will construct teams at the optimal size.
When we implement this assessment across all introductory classes, we can look at trends in normalized gain vs. pre-test score to make sure the course is useful to students with strong background knowledge, since they are likely to become our most effective majors.
When we implement the longitudinal assessment of students in the program, we will be able to identify specific areas of programmatic
weakness. The GCI gives subscores broken down topic-by-topic, so we can correlate low and high performance with specific courses that cover
the material.
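To illustrate how this correlation might work in practice, the following minimal sketch (Python; the topic names, course numbers, threshold, and scores are hypothetical placeholders, not actual GCI output) aggregates mean subscores by the course that covers each topic and flags weak areas:

    # Hypothetical mean class subscore per GCI topic (percent)
    subscores = {
        "plate tectonics": 62.0,
        "deep time": 48.0,
        "earth materials": 71.0,
    }

    # Assumed mapping from each GCI topic to the course that covers it
    topic_to_course = {
        "plate tectonics": "GEOL 106",
        "deep time": "GEOL 106",
        "earth materials": "GEOL 101",
    }

    by_course = {}
    for topic, score in subscores.items():
        by_course.setdefault(topic_to_course[topic], []).append(score)

    for course, topic_scores in sorted(by_course.items()):
        mean = sum(topic_scores) / len(topic_scores)
        flag = "  <-- review content coverage" if mean < 60.0 else ""
        print(f"{course}: mean subscore {mean:.0f}%{flag}")

In a real implementation, the subscores would come from the GCI results and the topic-to-course mapping from our curriculum.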
2a. Which Student Learning Outcome was assessed this year?
Graduate Program: “Develop and write a professional quality proposal to conduct an original research thesis in an area of geological research.”
(*This SLO is listed in our internal department documents, but SLOs for our graduate program have not yet made it into the catalog.)
2b. What assessment instrument(s) were used to gather evidence about this SLO?
We are in the “data collection” stage of developing a rubric because the requirement that all first-year graduate students write a research proposal is relatively new in the program. We have compiled the proposals and are coding them for commonalities in order to design a sufficiently detailed rubric for assessment.
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of
students, which courses, how decisions were made to include certain participants.
All new graduate students, as part of a required course, GEOL 694 (Graduate Thesis Design seminar). In Fall 2010, there were 7 students in the
course.
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was
a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
We treat the research proposal as an individual milestone for every student; we expect all graduate students to achieve this SLO before
beginning their thesis research.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the evidence was analyzed and highlight important findings from the
collected evidence.
None yet.
2f. Use of Assessment Results of this SLO: Think about all the different ways the resulting evidence was or will be used to improve academic
quality. For example, to recommend changes to course content/topics covered, course sequence, addition/deletion of courses in program,
student support services, revisions to program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc.
Please provide a clear and detailed description of how the assessment results were or will be used.
Deficiencies in proposals can be addressed by improving the content presented in GEOL 694, a seminar course devoted to research design. Since
graduate students work with their individual advisors on research projects, the advisor and thesis committee can implement appropriate
interventions, as needed. For example, students take GEOL 590 Literature Seminar in the same semester. If a finding from GEOL 694 suggests
that students have difficulty reading technical literature, GEOL 590 can help address the problem by providing strategies and practice in reading
journal articles.
3. How do this year’s assessment activities connect with your program’s strategic plan and/or 5-yr assessment plan?
During the last year, we laid the groundwork for a complete rethinking of our department’s vision for the future. Most notably, we secured a large financial gift to redesign our curriculum from the ground up. The gift provides faculty release time to devote serious thought and effort to curricular reform over the next four years. Rethinking our SLOs will be a starting point for the project, and assessment will be an integral part of the curriculum redesign. Details about the plan are included in item 5, below.
4. Overall, if this year’s program assessment evidence indicates that new resources are needed in order to improve and support student
learning, please discuss here.
5. Other information, assessment or reflective activities not captured above.
More details about our 4-year curriculum reform effort:
The planned four-year arc of curriculum reform begins with a planning phase, followed by design and implementation phases. The department has set aside funds to work with a consultant from On the Cutting Edge, a partnership between the National Association of Geoscience Teachers and the NSF. This expert will facilitate a discussion about innovative teaching methods and program design. One faculty member will serve as liaison with Cutting Edge and will conduct research on geology programs across the US. He or she will prepare a report that will lay the groundwork for the design phase of curricular reform in 2012-2014. Most of the design of the new programs and courses will take place in Year 2. Reform efforts in Year 3 will likely be divided evenly between design and pilot implementation of some of the new pedagogy. Implementation of the remaining new course designs will occur in Year 4. Assessment of the program will be integrated with the curriculum redesign.
6. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your
program? Please provide citation or discuss.
Figure 1
Figure 2