Overview and History of
Department-Designed Course Outcomes Assessment
at Prince George's Community College
NC State Assessment Symposium, April 2004
By William Peirce wpeirce@verizon.net
1999: PGCC began its assessment of course outcomes in fall 1999, when we attempted to assess
general education by collecting and scoring course-embedded assignments from 13 heavily
enrolled general education courses. Our two-year experiment did not tell us what we wanted
to know about general education, so we dropped it in favor of the Academic Profile test for
assessing general education. Our experience did help us prepare to assess course outcomes,
which is now a successful ongoing assessment process.
2000-01: All departments revised the course master syllabi for half of their courses to ensure that:
• Course objectives are stated as assessable learning outcomes in behavioral terms ("After successfully completing this course, the students will be able to . . .")
• Assignments and test questions address the learning outcomes and are assessed by criteria.
Revised master syllabi were reviewed by the faculty Academic Outcomes Assessment Committee (AOAC).
2001-02: All departments did the same for the remaining half of their courses.
2002-03: Departments made a five-year schedule for assessing priority courses and designed a
course assessment plan for the first course to be assessed. The department schedule provides
for one semester to plan the course assessment, the next semester to implement the
assessment, and the third semester to analyze and discuss the results. Each semester a new
course begins the three-semester course assessment cycle or a previously assessed course is
revisited.
The departments’ assessment plans have three components:
1. A process to ensure that all instructors, including adjunct faculty, are assessing all the
course outcomes
2. A systematic, valid assessment of at least 60% of the course outcomes by assignments and/or tests given in all sections of the course, or in a representative ¼ sample of sections in large multi-section courses (see the illustrative sketch after this list)
3. An analysis of the assessment results and a discussion by the faculty of how to
improve student learning
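The coverage threshold and sampling fraction in component 2 can be checked mechanically. What follows is a minimal, purely illustrative Python sketch, not part of the PGCC process itself; the outcome labels, section names, and function names are hypothetical. It shows one way a department might verify that its common assignments and tests cover at least 60% of the course outcomes and draw a representative ¼ sample of sections for a large multi-section course.

import math
import random

def plan_meets_coverage(course_outcomes, assessed_outcomes, threshold=0.60):
    # True if the outcomes addressed by the common assignments/tests
    # cover at least `threshold` of the course outcomes.
    covered = set(assessed_outcomes) & set(course_outcomes)
    return len(covered) / len(course_outcomes) >= threshold

def quarter_sample(sections, seed=None):
    # Draw a random 1/4 sample of sections (rounded up) for a
    # large multi-section course.
    rng = random.Random(seed)
    k = math.ceil(len(sections) / 4)
    return sorted(rng.sample(sections, k))

# Hypothetical example: 3 of 5 outcomes assessed (60%) across 12 sections.
outcomes = ["O1", "O2", "O3", "O4", "O5"]
assessed = ["O1", "O2", "O4"]
sections = ["EGL101-%02d" % n for n in range(1, 13)]
print(plan_meets_coverage(outcomes, assessed))   # True
print(quarter_sample(sections, seed=1))          # three randomly chosen sections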
2003 until the end of time: Departments continue their schedule of assessing priority courses,
revisiting previous courses as appropriate.
As of fall 2003:
• 425 course master syllabi had been rewritten with assessable outcomes and approved by the AOAC.
• 18 course assessments had been completed. All 18 departments said that some component of the assessment process was useful in improving the course.
• 46 course assessments were in progress.
Principles Guiding the Development of Assessment of Course Outcomes
1. The primary purpose of assessment is to provide valid and reliable data to improve
student learning.
2. Faculty design and carry out the assessment process.
3. Assessment is based on course-embedded tests and assignments, although additional
measures can be used.
4. Assessment is not linked in any way to evaluating individual faculty members.
5. Assessment results are analyzed each semester by department faculty to improve student
learning.
6. Assessment at PGCC is an evolving process, improving each year.
Principles Guiding the Development of Assessment of Learning Outcomes
—With Commentary
1. The primary purpose of assessment is to provide valid and reliable data to improve student
learning.
• NOT for "accountability."
• NOT to use national standardized tests as benchmarks to compare our students.
• "Valid and reliable data" means common test questions and common essay instructions among all sections being assessed. We distinguish "class assessment" from "course assessment": class assessment is an instructor grading the students' work in his or her own section; course assessment uses common test questions and/or common essay instructions among all sections being assessed.
2. Faculty design and carry out the assessment process.
NOT designed, dictated, or even looked at by the institutional assessment office unless
requested. The faculty Academic Outcomes Assessment Committee, with representatives
from each division, reviews the course assessment plans sent by the departments.
3. Assessment is based on course-embedded tests and assignments, although additional
measures can be used.
NOT by nationwide tests from vendors unless the department chooses one (e.g., nursing, engineering).
4. Assessment is not linked in any way to evaluating individual faculty members.
NOT part of the formal faculty evaluation process.
5. Assessment results are analyzed each semester by department faculty to improve student
learning.
See item 1, "The primary purpose of assessment is to provide valid and reliable data to
improve student learning."
6. Assessment at PGCC is an evolving process, improving each year.
• For example, this spring the AOAC added a critical thinking component to the course assessment process, to start in fall 2004.
• We comply with guidelines issued by the state of Maryland and the Middle States Commission.
• We obtain feedback about the assessment program and use it to improve assessment at PGCC.
Challenges We Face(d)
1. Syllabi listed objectives as topics to cover; faculty didn't know how to write assessable
outcomes.
Remedy: Virginia Anderson conducted a workshop in fall 1999 for department chairs and
about 60-70 faculty; the AOAC banned "understand" and "demonstrate" (unless students
actually did a demo) from the list of outcomes on course master syllabi; we provided lists of
verbs for Bloom's higher-order thinking outcomes; an AOAC member was appointed as
liaison to each department and helped the faculty write assessable outcomes; the AOAC
made suggestions when it disapproved syllabi.
2. Some courses listed only factual recall outcomes, no higher-order thinking.
Remedy: Until now, the AOAC has chosen not to dictate course content. Instead of the AOAC rejecting the outcomes, the assistant to the vice president tactfully pointed out the lower-level outcomes to the department and encouraged writing higher-order thinking outcomes. Starting in fall 2004, the course assessment plan will have to include an assessment of higher-order thinking outcomes or the AOAC will not approve it.
3. "Course outcomes assessment is too much work and not worth the effort."
At first the vice president for instruction wanted all outcomes assessed in all courses (which
really is too much work).
3
Remedy: Assess only 60% of the outcomes in priority courses. Start with an easy course to assess (few sections, faculty who get along well). Assess one new course each semester, so that three courses are in the pipeline at any time, each in one of three stages: planning, implementing, or analyzing and discussing results. Consult with the department's AOAC liaison or with the coordinator. Allow four semesters for difficult or time-consuming assessment plans.
4. Faculty knew how to do only class assessment, and had no experience with course
assessment.
Remedy: Conducted a two-morning workshop in May 2002 for department chairs and selected faculty on how to use common course-embedded test questions and assignments. Wrote an assessment handbook matched to the items on the Course Assessment Plan. Revised the forms to be clearer. The AOAC liaison or the coordinator consulted with faculty.
5. Test items did not match the outcomes on the syllabus.
Remedy: AOAC disapproved the plan; faculty revised and resubmitted the test questions
and/or revised the course outcomes.
6. Test items asked for only factual recall, not higher-order thinking.
Remedy: In the past, the vice president's assistant tactfully pointed this out to the department and encouraged higher-order thinking. Starting in fall 2004, the course assessment plan must include assessment of higher-order thinking outcomes or the AOAC will not approve it.
7. Faculty did a poor job of assessment: essays written on different topics under different
conditions; faculty scoring essays did not norm themselves.
Remedy: Coordinator of AOAC tactfully pointed out why the results were not reliable in a
conversation with the chair of the department. It's hard to make resistant faculty care about
assessment when it is a nuisance and seems to provide few benefits. For one department, the
only benefit of the assessment process was the exchange of assignments among faculty
during the planning stage.
8. Common assessment is a violation of individual academic freedom.
Remedy: Continual personal dialogue by a sympathetic coordinator. Allow as much freedom (classroom autonomy) as possible, consistent with the broad goal of assessment, which is to use evidence about student performance to guide an informed discussion by the faculty.
Inevitable Tensions in the Outcomes Assessment Process
1. Educational research data vs. action research data
2. Assessment for accountability vs. assessment for improvement of student learning
3. Primacy of the assessment process vs. primacy of instructor autonomy
4. Primacy of the assessment process vs. need for faculty cooperation and acceptance
For example, to assess EGL 101, first-semester composition, the English department is not
assessing portfolios—although that is probably the best way to measure growth in writing ability.
Instead we are using impromptu essays written during the final exam period in a representative ¼
sample of sections. Assessing portfolios would produce better educational research data, but the English faculty need only data that is good enough to make decisions about how to improve the course. Assessing portfolios would require faculty who don't use portfolios to use them (which disrespects their autonomy and makes them less willing to cooperate), and it is too labor-intensive.
At PGCC the Academic Outcomes Assessment Committee seeks compromises with the
department when these conflicts arise in reviewing a department's assessment plans and has a
bias towards the second of each of these four dichotomies.
Sample Statements from the Departments' Reports
About Their Assessment Process
Excerpt from 2002-03 year-end report to Dr. Vera Zdravkovich, Vice President of Instruction, by
William Peirce, coordinator of academic outcomes assessment:
The highlight of the 2002-03 year was the high number of departments whose faculty reported
that their first course assessment actually resulted in improvements in the course. The following
is a summary of highlights from the departments' reports on their assessment process.
Uses of Assessment Results by Departmental Faculty
Accounting 201, Intermediate Accounting
"The group identified three areas for improvement: outcomes, the design and
implementation of assessment, and outcomes that students had not mastered. Two of the
outcomes were so closely related that they will be merged. In the design of the materials
the breadth and depth of coverage of outcomes varied between sections. The group felt
this distorted the assessment and plans to use the same assessment materials in all
sections in the future. Poor communication with the adjunct faculty member whose
section was being assessed resulted in incomplete data. Finally, areas of student weakness
were identified and additional time will be devoted to those topics and new techniques
will be tried to improve student learning."
Art 271, Art Survey II
For the art faculty, the assessment process resulted in careful writing of assessment
instruments and full discussion of the results. "The assessment was helpful in that the
instructors worked closely together and discussed the course thoroughly with one
another. The department carefully defined the course learning outcomes and examined
the assessment results. While the assessment did not significantly alter the teaching
method, it helped give a more uniform approach to the course content."
Biology 101, General Biology
The pre-test and post-test assessment raised many questions among the faculty teaching
the course about the assessment instruments and the interpretation of results. Their report
states, "At this point, the 101 assessment team has invested quite a substantial amount of
time and energy into this opening phase of the assessment project, and has generated a
fair amount of not only data but also insights into the question of how to assess learning in
light of specific expected outcomes. . . . The time and resources that we shall invest in the
future will be far better spent if we are able first to enlist some professional help in
specific areas such as test question design, statistical interpretation, and biological
science pedagogy."
Computer Information Systems 185, Web Site Design and Implementation
The results indicated no need to improve the course, but the participating faculty
recognized the value of improving the assessment process.
Criminal Justice 254, Police Management
Where tabulation of the results showed deficiencies, the instructor agreed that the topics
should receive more coverage in lectures in future classes.
Developmental Reading 105
"The faculty concluded that the assessment was both beneficial and revealing." The
reading faculty instituted several changes:
1. Since they teach how to annotate text in the course, they should allow annotation on the assessment instrument and on classroom tests and quizzes, even though it adds to duplicating costs.
2. The assessment results showed that students were better at finding the main idea in
short, single paragraphs than in longer passages; the faculty will give students more
practice with multi-paragraph selections.
3. Students need more work with word parts (prefixes, suffixes, and root words).
Economics 103, Principles of Economics I
In discussing the assessment results, the three full-time economics faculty identified items on which students scored poorly and reworded some of the test questions.
"Our discussion focused, however, on how to change the way we teach certain concepts,
including giving more examples and as take-home exercises. We also decided to include
more items for assessing each outcome."
Education 210, Processes and Acquisition of Reading
The assessment process revealed weaknesses in one instructor's testing methods and
inspired a fruitful discussion between the faculty member and the department chair. "A
positive result of this process was a raised awareness on the instructor's part to assess
more of the outcomes throughout the course and not leave the assessment of the essential
material to the end."
English 223, Children's Literature
The results were considered good. Even so, both instructors considered ways of
modifying their courses next time. "The instructors—all relatively new to teaching EGL
223—agree that the assessment process was useful as a way to exchange ideas about
what works and what doesn't work in this class. In addition, the instructors believe that
the assessment gave them valuable insights into what kinds of students take this class and
how they can help them achieve the learning outcomes in the semester."
English as a Second Language 105, Intermediate Reading Skills
The department was satisfied with the results, but their discussion generated many questions
about valid assessment. "One concrete departmental change that was agreed upon at the
conclusion of this discussion was that we will make it policy for all adjuncts to submit
their final exams to their Course Coordinators a week prior to administering them."
Health 221, Human Sexuality
The veteran instructor of HLE 221 has been using the college's OPSCAN system since its
inception to improve the validity and reliability of his test questions. Using assessment
results to improve learning and testing has been his continuous practice. Nevertheless,
this analysis of the assessment results led to minor adjustments in his teaching methods and an intention to give more, smaller exams next semester, each covering less content.
History 143, History of the United States II
The faculty participating in the discussion concluded that the assessment results were
satisfactory and saw no need to modify the course, but recognized that the assessment
was limited to one outcome.
Management 160, Principles of Management
The business management faculty teaching the course "found the assessment activities
and results interesting and useful." While "for the most part the results were in line with
faculty expectations," the department used the results to improve learning in seven
specific ways. "As a result of the pilot assessment and discussion of results, the MGT
160: Principles of Management Expected Course Outcomes have been tested, examined,
and revised."
Math 119, Probability
A discussion by the faculty resulted in changes in the assessment method, in
improvements in the course syllabus, and in improved communication with full-time and
adjunct faculty. The following changes were initiated, as quoted in the math department's
report:
• "Make the core portion of the final exam count in some way towards a student's final exam score in the class.
• Integrate questions from the core portion of the exam within the structure of the exam given in class. This would make the questions difficult to identify as core questions.
• Create a more detailed department syllabus outlining timing by week of material to be covered.
• Communicate with both full-time and adjunct faculty about the process of assessment and the results gathered so far in the course."
Music 100, Fundamentals of Music Theory
"The results show that learning outcome number five is not being tested. This deficiency
is being remedied this semester and will continue to be remedied in future presentations
of the course."
Philosophy 109, Practical Logic
The results did not indicate a need to revise the course, as it parallels what is taught nationally. The assessment process did result in a discussion that inspired the teachers to rename the course, consider using a common textbook in all sections, consult with the counselors for more appropriate placement in PHL 109, and design a new, more practical course in informal logic.
Physical Science 121, Earth and Space Science Concepts for Pre-k to 8 Teachers
"Based on the results, the instructor has
1. changed the sequence of course topics for better conceptual flow;
2. introduced additional worksheets for practice on skills and concepts for topics such as
topographic map interpretation, weather interpretation/prediction, and use of star
charts;
3. spent additional time discussing the processes of science and de-emphasizing
memorization."
Sociology 101, Introduction to Sociology
The department was pleased with the results generally, but saw that several areas needed
attention, quoting from its report:
1. "A low right to wrong ratio for some items has generated a need for the sociology
faculty to revisit and re-analyze those items deemed problematic. We are currently
involved in that process.
2. The results indicate that there is a problem with test item 6. . . . Further scrutiny led us
to conclude that rewording the item for greater clarity would also be an improvement.
3. The sociology faculty agreed that the assessment results would have had greater validity if there had been more than one test item devoted to measuring each outcome. Thus we are in the process of developing a test that has multiple test items per course outcome."
Theatre 101, Introduction to the Theatre
The theatre faculty "reached the conclusion that the course was working well and no
changes needed to be made." At the beginning of the assessment process, the faculty
realized that the textbook they had been using did not address the revised course
outcomes, so they found one that focused more closely on the revised course outcomes.
"Therefore the assessment process was valuable and did result in changes in the course."
*     *     *     *     *
For additional documents about assessment at PGCC, visit our outcomes assessment web site
at <http://academic.pgcc.edu/assessment>. A copy of this presentation will be placed there soon.
The current coordinator of academic outcomes assessment (started fall 2003) is Mike Gavin at
mgavin@pgcc.edu. Feel free to contact him or me if we can help.