Annual Assessment Report to the College 2010-2011

College: __Science and Mathematics____________________
Department: ___Chemistry and Biochemistry____________
Program: ______Chem BA, BS/MS, Biochem BS/MS_______________
Note: Please submit report to your department chair or program coordinator and to the Associate Dean of your College by September 30, 2011.
You may submit a separate report for each program which conducted assessment activities.
Liaison: _____Thomas Minehan_______________________
1. Overview of Annual Assessment Project(s)
1a. Assessment Process Overview: Provide a brief overview of the intended plan to assess the program this year. Is assessment under the
oversight of one person or a committee?
Assessment in our department is under the oversight of a four-person committee, currently composed of Curtis, Schrodi, Minehan (liaison), and Nguyen-Graff.
The following assessment activities were planned for this year:
a. Implement rubrics for the assessment of (SLO4b): Work effectively in a laboratory environment, including the ability to maintain a
proper lab notebook.
b. Develop a rubric for the evaluation of written research/scientific reports (SLO2, 3)
c. Assess basic knowledge in biochemistry and general chemistry (SLO1) using standardized exam questions in course finals.
d. Review results from the implementation of a rubric for the assessment of (SLO2): classroom presentations and graduate student
seminars.
e. Assess students’ ability to perform Quantitative Chemical Analysis (SLO6).
f. Evaluate previously developed and implemented Personnel Procedures for non-tenure track faculty.
1b. Implementation and Modifications: Did the actual assessment process deviate from what was intended? If so, please describe any
modification to your assessment process and why it occurred.
Although a “customizable” lab notebook rubric has been crafted for the department, lab coordinators have so far adopted the rubric in only a limited number of courses, and those data are presented below. As a result, we will further publicize the rubric and evaluate SLO4b in next year’s assessment report as well. Assessment of lab notebooks in the course Chem 321L has occurred for several years without the new rubric, and those results are also presented below.
2. Student Learning Outcome Assessment Project: Answer questions according to the individual SLO assessed this year. If you assessed an
additional SLO, report in the next chart below.
2a. Which Student Learning Outcome was measured this year?
SLO 1: Assess basic knowledge in the following areas of chemistry: general chemistry and biochemistry.
2b. What assessment instrument(s) were used to measure this SLO?
For general chemistry (Chem 101): 25 multiple-choice questions from an ACS standardized exam in general chemistry were embedded in the
course final. Relevant questions were chosen from the ACS general chemistry exam, since the comprehensive exam covers topics not presented
in Chem 101.
For biochemistry (Chem 464): 5 multiple-choice questions from an ACS standardized exam in biochemistry were embedded in the course final.
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was
a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
A cross-sectional design methodology, in which freshmen are compared with seniors, was used:
General Chemistry: Chem 101 is our gateway course taken by both majors and non-majors, and is typically populated by freshmen. The
foundational concepts introduced in this course are crucial for student success in all subsequent courses in the major. Thus, assessment at this
introductory stage will allow us to establish a baseline level of student performance useful for comparison with assessments done in later
courses in our undergraduate program.
Biochemistry: for the chemistry seniors who make up a significant portion of the students in Chem 464, this class typically would be one of the
last courses they take in the major, and thus the assessment in this course may provide information on how our students have been able to both
understand and apply fundamental concepts in chemistry they have learned throughout the program.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the
data collected.
General Chemistry: it was found that 50% of the students in the section performed above the national average, and the average score for the
section was slightly better than the national average (58.3% vs. 56.4%). In addition, there was a strong correlation between the results on the
assessment questions and the overall course grades. This may indicate that the course difficulty is roughly at the correct level compared to
national courses. Thus, at the introductory level our students appear to be on par with their peers across the country in terms of their grasp of
foundational chemical concepts.
Biochemistry: 55% of the students taking Chem 464 (22 of 40) answered 3 or more of the 5 assessment questions on the final exam correctly (a score of 3/5 has been identified as the benchmark level for student success). The average class score was 54% (2.7/5 correct). While it is encouraging that more than half of the class achieved the benchmark level, there are no previous results against which to compare this performance, since this was the first semester of assessment for biochemistry. Taking the national average as the benchmark for success in Chem 101, a slightly greater percentage of students achieved the benchmark level in Chem 464 than in Chem 101. With future assessments and more data, we would hope to see an even greater percentage of students in one of our capstone courses achieve the benchmark level of performance.
2f. Use of Assessment Results of this SLO: Think about all the different ways the results were or will be used. For example, to recommend
changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to
program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed
description of how the assessment results were or will be used.
Based on these results, we have:
1.) Established that, in order to improve the percentage of students achieving the benchmark level of success in our assessments, critical thinking and problem-solving skills need to be emphasized not only in introductory but also in upper division courses. Since lecture courses
are driven by direct information transfer, often the best forum for achieving this emphasis on critical thinking is discussion/recitation
sessions or in-class exercises where students work on assigned exercises and practice problems in groups and present their results to
the class.
2.) Initiated the process of establishing a required recitation course for Organic Chemistry I. The results of previous assessment activity in
our organic courses have demonstrated the strong need for such problem-solving sessions. Optional sessions are currently in place, and
once the mandatory session for Organic Chemistry I is established we will re-assess student performance in the organic sequence to
evaluate the impact of recitation on student performance.
3.) Archived this data for future comparison. If the mandatory recitation sessions in organic chemistry lead to an improvement in the
percentage of students achieving the benchmark level of success on our assessments, then appropriate changes in
instructional/teaching methods in other upper division (300 or 400 level) chemistry courses may be brought up for consideration by the
faculty.
2. Student Learning Outcome Assessment Project: Answer questions according to the individual SLO assessed this year. If you assessed an
additional SLO, report in the next chart below.
2a. Which Student Learning Outcome was measured this year?
SLO 2: Organize and communicate scientific information clearly and concisely, both verbally and in writing.
2b. What assessment instrument(s) were used to measure this SLO?
A rubric to evaluate graduate student seminars was crafted and approved by the department. Beginning in Spring 2010, this rubric was used by department faculty in evaluating graduate student seminars.
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of
students, which courses, how decisions were made to include certain participants.
The sample included all graduate students giving either a literature seminar (Chem 691) or thesis seminar (Chem 692) for the Fall semester 2010
and Spring/Summer semesters 2011. In Fall 2010 there were 3 literature seminars and 4 thesis seminars. In Spring/Summer 2011 there were 4
thesis seminars and 4 literature seminars.
2d. Describe the assessment design methodology: For example, was this SLO assessed longitudinally (same students at different points) or was
a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
Graduate students giving literature seminars are at (approximately) the half-way mark in their progress towards the master’s degree; they have
taken several upper division courses, some or most of which require in-class oral presentations. Graduate students giving their thesis seminar
have completed all required graduate coursework as well as their experimental work. Thus, it is hoped that sequential evaluation of the literature and thesis seminars of individual students will allow us to track the growth of a student’s oral communication abilities.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the
data collected.
The scoring rubric had five categories: organization, understanding of scientific content, style/delivery, use of visual aids, and ability to answer questions. Performance in each category could be rated with a score of 0-20 points, and the rubric provided descriptions for excellent (16-20 points), good (11-15 points), marginal (6-10 points), and inadequate (0-5 points) performance. Faculty attending the seminars filled out the rubrics and forwarded them to the seminar coordinator, who tabulated the results for each category and obtained an average score for literature seminars and for thesis seminars.
Fall 2010 average scores (lit / thesis): organization, 18.3 / 17; understanding of scientific content, 17.3 / 17; style and delivery, 17.3 / 16.75; use of visual aids, 17.3 / 16.75; ability to answer questions, 15.7 / 15.75.
Spring/Summer 2011 average scores (lit / thesis): organization, 18.75 / 18.75; understanding of scientific content, 18.25 / 18.5; style and delivery, 17.75 / 18.25; use of visual aids, 17.25 / 18; ability to answer questions, 17.25 / 17.
The results indicate that, on the whole, the graduate students are doing well in their oral seminars, since the average scores in most categories are in the 17-18 range. The weakest category in all cases is the ability to answer questions, with average scores ranging from 15.7 to 17.3. In the past this has been attributed to a lack of depth of scientific understanding on the part of the student; however, the average scores in the understanding of scientific content category (17-18.5) do not necessarily corroborate this idea.
Compared to the data collected in Spring 2010, average scores in literature seminars have improved while average scores in thesis
seminars have gone down: Organization, up 0.8 points for lit, down 0.35 for thesis; understanding, up 0.65 for lit, down 1.0 for thesis; style, up
2.0 for lit, did not change for thesis; visuals, up 0.6 for lit, down 0.85 for thesis; questions, up 1.2 for lit, down 0.9 for thesis. It is thought that
the reason for this trend is that faculty are now following the scoring rubric more closely than before (vide infra).
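For illustration, the tabulation described above (averaging faculty scores per category, separately for literature and thesis seminars) could be carried out as sketched below; the record layout and sample scores are assumptions, not the coordinator's actual procedure or data.

```python
from collections import defaultdict

# Each completed rubric contributes one 0-20 score per category.
# Illustrative records only; not the actual 2010-2011 data.
rubrics = [
    {"seminar": "lit",    "category": "organization",                "score": 18},
    {"seminar": "lit",    "category": "ability to answer questions", "score": 15},
    {"seminar": "thesis", "category": "organization",                "score": 17},
    {"seminar": "thesis", "category": "ability to answer questions", "score": 16},
]

def average_by_category(records):
    """Average the faculty scores for each (seminar type, category) pair."""
    buckets = defaultdict(list)
    for r in records:
        buckets[(r["seminar"], r["category"])].append(r["score"])
    return {key: sum(scores) / len(scores) for key, scores in buckets.items()}

for (seminar, category), avg in sorted(average_by_category(rubrics).items()):
    print(f"{seminar:6s} | {category:30s} | {avg:5.2f}")
```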
2f. Use of Assessment Results of this SLO: Think about all the different ways the results were or will be used. For example, to recommend
changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to
program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed
description of how the assessment results were or will be used.
Based on these results, we have:
1.) Advised students that practicing their literature and thesis seminars before one or two faculty members, who can provide questions and feedback prior to the actual presentation, may improve their ability to answer questions. This suggestion will be included in the information form students receive before preparing their seminars and can be reiterated by both the seminar coordinator and supervising faculty. These strategies will be implemented in Spring 2012.
2.) Crafted a new rubric scoring sheet on which faculty must write in a score for each category (rather than an overall score), requiring them to think more specifically about what score to give in each instance. Previously, the seminar coordinator suggested that perhaps the high average scores
in each category indicate that faculty are not grading strictly according to the rubric or are unwilling to give poor grades to students on the
thesis/literature presentations. It is felt that this recent simple change in the rubric scoring sheet has resulted in faculty scoring more accurately,
as reflected in the greater similarity of the average scores for literature and thesis seminars in the 2010-2011 academic year.
2a. Which Student Learning Outcome was measured this year?
SLO4b: Work effectively in a laboratory environment, including the ability to maintain a proper lab notebook.
2b. What assessment instrument(s) were used to measure this SLO?
A general rubric to evaluate the laboratory notebooks of undergraduate students taking laboratory courses in our program was crafted. The
rubric was distributed to all faculty at the beginning of the fall 2010 semester with the intent that individual course coordinators could modify
the rubric at will to suit the needs of their individual courses. Categories in the rubric included pre-lab preparation, in-lab qualitative
observations, tabulation of data and results, and post-lab reflection/conclusion. A desired benchmark score of 75% has been set for this rubric.
Prior to the development of this rubric, instructors in Chem 321L had evaluated the quality of each student’s laboratory notebook for many years. The 321L notebook evaluation covers the essential points of the new rubric above, but its focus is on completeness, and the evaluation tends to be more detailed and comprehensive. The desired benchmark score for this evaluation has thus been set at 70%.
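For illustration only, the sketch below shows how a completed rubric might be scored against the 75% benchmark; the 4-point per-category maximum follows the scale reported under 2e below, while the restriction to the four categories named above and the sample scores are assumptions for the example.

```python
# Per-category scores for one notebook, each out of a 4-point maximum
# (the maximum and the scores shown are illustrative assumptions).
notebook_scores = {
    "pre-lab preparation": 4,
    "in-lab qualitative observations": 2,
    "tabulation of data and results": 3,
    "post-lab reflection/conclusion": 3,
}
MAX_PER_CATEGORY = 4
BENCHMARK = 0.75  # overall benchmark set for the new rubric

total = sum(notebook_scores.values())
possible = MAX_PER_CATEGORY * len(notebook_scores)
fraction = total / possible

print(f"Notebook score: {total}/{possible} ({fraction:.0%})")
print("Meets benchmark" if fraction >= BENCHMARK else "Below benchmark")
```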
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of
students, which courses, how decisions were made to include certain participants.
At this initial stage of implementation of the new rubric, only a few instructors have adopted its use, and data are available only from Organic Chemistry II lab (Chem 334L). Notebooks were collected at the end of the Spring 2011 semester from all Chem 334L sections (a total of 162 students), and 23 notebooks of chemistry/biochemistry majors and minors were reviewed by the course teaching assistants.
All chemistry and biochemistry majors and minors take Chem 321L. Notebook data from four sections were analyzed: Spring 2005 (17 students),
Fall 2006 (18 students), Fall 2007 (17 students) and Fall 2009 (15 students). The evaluation is done in two parts, with the first check made during
week 6 and the second check done at the end of the semester.
2d. Describe the assessment design methodology: Was this SLO assessed longitudinally (same students at different points) or was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
Organic chemistry II lab is taken by students after a full semester of Organic Chemistry I lecture and lab. The course is taken approximately halfway through the program for chem/biochem majors, and a significant percentage of students in this course are biology majors. Since by this
time the students would have taken two general chemistry labs (Chem 101L and Chem 102L) and one organic chemistry lab (Chem 333L), we
feel that assessment at this intermediate point through the program should provide valuable information about how well our students have
learned to keep a proper lab notebook, which is an essential skill for practicing scientists.
Students taking Chem 321L generally have completed General Chemistry and quite a few have also completed Organic Chemistry. As such these
students are approximately mid-way through the program, and thus it may be expected that they have developed some fundamental skills in
notebook keeping that may be assessed at this intermediate time point.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the
data collected.
For the 334L notebooks reviewed, an average score of 15/20 (75%, the benchmark level) was obtained from rubric data gathered from 4
teaching assistants. Of the five categories the students were evaluated in, the weakest category (with an average of 2.4/4 points) involved
recording in-lab qualitative observations. The teaching assistants noted that student mistakes during an experiment (such as solvent/reagent
spillages during extractions, overheating/underheating reaction mixtures, etc) were often not recorded in the notebooks. A further weak
category (with an average 2.7/4 points) was post-lab reflection/conclusion. Some teaching assistants noted that while observations and
conclusions were often omitted altogether, detailed procedures carefully copied from the lab manual were included in notebook entries for
each experiment. While the overall notebook-keeping ability of our majors and minors may be judged satisfactory at this intermediate point in the program, students need to concentrate more on recording their data, observations, and conclusions in the notebook during lab, rather than on polishing (“doctoring”) the notebook for mechanics and cleanliness outside of lab.
For Chem 321L Spring 2005, 41.2% of students achieved the benchmark score of 70% at check 1 and 29.4% achieved benchmark at check 2. For Fall 2006, 16.7% achieved benchmark at check 1 and 61.1% achieved benchmark at check 2. For Fall 2007, 35.3% achieved benchmark at check 1 and 26.7% achieved benchmark at check 2. Finally, for Fall 2009, 40% achieved benchmark at check 1 and 26.7% achieved benchmark at check 2. These results suggest the following: 1.) the sometimes large variation in scores seems to reflect the small class size and its varying skill level; 2.) sometimes there is significant improvement over the course of the semester, but when this does not occur it may reflect student attitude in a particular section; 3.) although the class average occasionally exceeds the benchmark level, improvement is needed in the way students maintain a laboratory notebook in Chem 321L.
2f. Use of Assessment Results of this SLO: Think about all the different ways the results were (or could be) used. For example, to recommend
changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to
program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed
description of each.
Based on these results, we have:
1.) Established benchmark levels for student success overall (75%) and in each category (3/4) of the new rubric.
2.) Re-emphasized the importance of the in-lab recording of experimental observations in all laboratory courses, starting at the 100 level.
Although sample notebook entries are provided in course lab manuals, teaching assistants will be encouraged to make it a point to review what
a proper lab notebook entry for an experiment looks like on the first day of class. This will be communicated to course coordinators at the
beginning of each semester from now on.
3.) Suggested that TAs perform periodic notebook spot-checks (with points assigned) throughout the semester in order to remind students of the importance of in-lab qualitative observations. This practice has been in place for some time in the organic lab sequence (Chem 333L and Chem 334L), but assigning a greater number of points to the spot-checks may better reflect the importance of notebook maintenance in the overall lab course grade. If successful, this strategy could be employed in other lab courses in our program.
4.) Considered providing Chem 321L students with more detailed examples of a proper lab notebook, as well as making the notebook score a
more significant part of the course grade to further emphasize the importance of this skill.
2a. Which Student Learning Outcome was measured this year?
SLO6: Perform Quantitative Chemical Analysis
2b. What assessment instrument(s) were used to measure this SLO?
The emphasis in Chem 321L is on quantitatively analyzing samples using a variety of wet chemical and instrumental techniques. Students perform 8-9 analyses each semester and receive a score between 5 and 10 based on how close their numerical result is to the accepted value. The level of accuracy required for a top score depends on the technique(s) used; however, a grade of 10 generally requires a relative error of only a few parts per thousand. The acceptable benchmark performance has been set at a class average of 80% for each experiment.
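For illustration, the relative error used to judge a result can be expressed in parts per thousand as sketched below; the sample values are hypothetical, and the actual score cutoffs depend on the technique used.

```python
def relative_error_ppt(measured, accepted):
    """Relative error of a result, expressed in parts per thousand (ppt)."""
    return abs(measured - accepted) / abs(accepted) * 1000

# Hypothetical example: a student reports 21.49% analyte where the accepted value is 21.56%.
error = relative_error_ppt(21.49, 21.56)
print(f"Relative error: {error:.1f} ppt")  # about 3.2 ppt, i.e. a few parts per thousand
```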
2c. Describe the participants sampled to assess this SLO: discuss sample/participant and population size for this SLO. For example, what type of
students, which courses, how decisions were made to include certain participants.
All chemistry and biochemistry majors and minors take Chem 321L. Each section of Chem 321L contains 15-18 students. Data were collected from four sections: Spring 2005 (17 students), Fall 2006 (18 students), Fall 2007 (17 students), and Fall 2009 (15 students).
2d. Describe the assessment design methodology: Was this SLO assessed longitudinally (same students at different points) or was a cross-sectional comparison used (comparing freshmen with seniors)? If so, describe the assessment points used.
Students taking Chem 321L generally have completed General Chemistry and quite a few have also completed Organic Chemistry. As such these
students are approximately mid-way through the program, and thus it may be expected that they have developed some fundamental skills in
quantitative chemical analysis that may be assessed at this intermediate time point.
2e. Assessment Results & Analysis of this SLO: Provide a summary of how the data were analyzed and highlight important findings from the
data collected.
For Spring 2005 average scores on experiments 1-8 ranged from a low of 76 (experiment 5) to a high of 94 (experiment 4). For Fall 2006, average
scores on experiments 1-8 ranged from a low of 68 (experiment 5) to a high of 95 (experiment 7). For Fall 2007, average scores on experiments
1-8 ranged from a low of 79 (experiment 5) to a high of 95 (experiments 4 and 7). For Fall 2009, average scores on experiments 1-9 ranged from
a low of 81 (experiments 6 and 7) to a high of 93 (experiment 8). It is noteworthy that experiment 5 has historically resulted in much poorer scores, with the average generally falling below the 80% benchmark. Overall, however, this history of assessment strongly suggests that the average student in Chem 321L is capable of performing careful quantitative analyses using a variety of techniques.
2f. Use of Assessment Results of this SLO: Think about all the different ways the results were (or could be) used. For example, to recommend
changes to course content/topics covered, course sequence, addition/deletion of courses in program, student support services, revisions to
program SLO’s, assessment instruments, academic programmatic changes, assessment plan changes, etc. Please provide a clear and detailed
description of each.
Based on the results for Spring 2005, Fall 2006, and Fall 2007, which pointed to a problem with experiment 5 (i.e., student results always fell
below the benchmark class average of 80%), we have implemented a new experiment (starting in Fall 2009) that avoids problems related to the
equipment used for the previous experiment 5. The results (as can be seen for the Fall 2009 data above) are now more in line with other
experiments. This change also provided additional lab time so that a further new experiment (#9) was added in Fall 2009.
3. How do your assessment activities connect with your program’s strategic plan?
Department assessment most closely aligns with the plan to improve the quality of teaching in our Department. By identifying areas of
weakness for our students with respect to our SLO’s, our full time faculty can make necessary changes to course content and instructional
approaches or suggest new preparatory, pre-requisite or co-requisite courses (such as discussion or recitation sessions) to assist students in
gaining the necessary foundational knowledge so that they can progress through the core courses in their degree. As mentioned above, we
have recently initiated the process of creating a mandatory recitation session for our Organic Chemistry I course, since our assessment results
clearly indicated a need for our students to practice problem solving and acquire critical thinking skills. We anticipate that this program
modification will greatly improve student learning and allow students to progress through to the upper division courses in a more timely
fashion. Furthermore, the assessment activities described in section 5b below help ensure that the teaching activities of our non-tenure track
faculty are of a high standard.
4. Overall, if this year’s program assessment evidence indicates that new resources are needed in order to improve and support student
learning, please discuss here.
Assessment during 2010-2011 did not indicate the need for new resources.
5. Other information, assessment or reflective activities not captured above.
a.) One of the members of our departmental assessment committee (Yann Schrodi) has developed a rubric for assessing written assignments in
chemistry, relevant to department student learning outcomes 2 and 3. Categories in the new rubric include Abstract/Introduction/Theory,
Materials and Methods, Discussion, Conclusions, References, Grammar, and Formatting. It is hoped that this new rubric will be implemented in the Fall 2011 semester, especially for evaluating student research reports in Chem 499 and Chem 495.
b.) Several years ago the Department developed and adopted a specific set of guidelines for the evaluation of non-tenure track faculty. The
department continues to thoroughly review and provide feedback on the teaching of all non-tenure track faculty who teach lecture sections for
the Department. This evaluation consists of a class visit, a comprehensive review of all written instructional materials including syllabi, quizzes,
handouts, and tests, and an examination of student evaluations for all lecture sections taught. The evaluation report is discussed with each
faculty member under review by a member of the Department Evaluation Committee, and each faculty member is encouraged to share this
information with his or her faculty mentor.
The Department is in the third year of this annual evaluation process. In most instances the faculty under review have improved in areas of
concern noted in their evaluation. This process did identify an instructor who was not re-hired by the Department because of a failure to address
concerns that were repeatedly raised.
6. Has someone in your program completed, submitted or published a manuscript which uses or describes assessment activities in your
program? Please provide citation or discuss.
No