Stevens BS-CS Program Evaluation Report for 2008-09
The report is divided into two sections, process and results. The process section evaluates the
process used to assess the program. The results section evaluates the extent to which the
program is meeting its goals (i.e., objectives and outcomes). Each section contains two
subsections: discussion and planned improvements.
1. Process
1.1. Discussion
Last year's report identified the following weaknesses in our assessment process:
1. We did not have program objectives.
2. We did not survey alumni to gather their opinions about how well the program established
its outcomes. The effort of creating outcomes and convincing instructors to assess their
courses was great enough that we did not have the energy to tackle this requirement.
3. Our approach to creating outcomes was bottom-up when it should have been top-down. We
realized only after finishing course and program outcomes that we should have used a
sequential process: first define program objectives, then define program outcomes in
support of the objectives, then define course outcomes in support of program outcomes.
We instead neglected objectives altogether and defined program and course outcomes
virtually in isolation from one another.
4. Some program outcomes receive light coverage in the curriculum. For example, outcome
#9 (ABET H) and outcome #11 (ABET G) are each covered by a single instrument in a
single course. While this may be sufficient, it raises the danger that a single change in
course delivery might leave the outcome uncovered in that year. Further, coverage by a
single instrument in a single course raises the question of whether the program outcome
is taught in sufficient depth.
Accordingly, in last year's report we wrote this plan for addressing each of the above process
weaknesses:
1. In concert with stakeholders in the program, we will define program objectives.
2. We will survey one-year-out alumni regarding outcomes.
3. We will survey three-years-out alumni regarding objectives. We choose to survey about
outcomes after one year because recent graduates are more likely to hold technical
positions that test their abilities in the areas covered by our program outcomes. We
choose to survey about objectives after three years because objectives are higher-level
qualities best evaluated after graduates gain some experience.
4. We will survey employers of alumni regarding objectives. Alumni and their employers
are the primary stakeholders in the program.
This year we partially addressed these weaknesses. Most notably, we developed a set of
program objectives, available at
http://www.stevens.edu/compsci/accreditation/bs_cs_prog_obj.html. Three inputs went into the
development of these five objectives. First, we drew on our own thoughts about the type of
graduates we wish to produce. Second, we surveyed ABET objectives published by other departments around
the nation. Finally, we discussed candidate objectives with our IAB (Industrial Advisory Board)
in October. The IAB consists of about 6 representatives from companies who regularly hire our
graduates. Since the vast majority of our graduates take jobs immediately after graduation,
primary stakeholders are the graduates and their employers. The IAB serves to represent the
employers. We hold one IAB meeting per year in the late September or early October time
frame. We developed candidate objectives before the IAB meeting, discussed them at the
meeting, made some alterations based on IAB comments, then published them. Our objectives
are intended to describe qualities graduates should exhibit a few years after graduation. We kept
the set small to encourage alumni and employers to respond to our surveys.
We also succeeded in surveying employers about these objectives. Reaching employers is
difficult for several reasons. First, even if we know where a graduate is employed we seldom
know the name or contact information of the alumnus/alumna's direct supervisor or anyone in a
position to evaluate the abilities of Stevens alumni at that employer. Second, graduates have
proven (understandably) reluctant to give us access to their supervisors. Third, employers with
whom we have discussed the matter, such as IAB members, indicate that their organizations
invariably have strict rules governing the privacy of any information that is or might be
construed to be employee performance evaluation information. We will continue to try to reach
people who are sufficiently highly placed within their organizations that they are comfortable
releasing "average" evaluations of Stevens graduates whom they have known.
We did not succeed in surveying alumni regarding either outcomes or objectives (points 2 and 3
above).
We also failed to assess quantitatively outcome #11 (ABET G), societal impact. The reason is
that the instructor of the "Computers and Society" course did not maintain student performance
data that could populate a SPAD. She did teach material regarding the local and global impact of
computing, and she did assess student performance, but she did not retain the needed records.
1.2. Planned Improvements in Assessment Process
For the coming school year, we plan once again to survey employers regarding objectives. We
also plan to "finish" the survey portion of our assessment process by successfully conducting
surveys of alumni. One-year-out alumni will be surveyed regarding program outcomes and
three-year-out alumni will be surveyed regarding objectives. How to locate alumni is an issue.
We expect that, thanks to the rise of social networking, this will be much easier than it was even
a few years ago. We expect to be able to find a majority of our graduates in networks such as
LinkedIn or Facebook.
Regarding our failure to assess all ABET-required outcomes: for the coming year, the program
director will contact every instructor of a required course before the beginning of each semester
to ensure that the instructor plans to assess enough course outcomes that the overall program
is fully assessed, i.e., so that there are no "uncovered outcomes" as there was this year.
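To illustrate the check, here is a minimal sketch in Python; the outcome mappings shown are hypothetical placeholders for the instructors' actual plans, not our real assessment data:

    # Hypothetical pre-semester plan: each required course mapped to the
    # program outcomes its instructor intends to assess. The mappings below
    # are illustrative placeholders only.
    planned_coverage = {
        "CS 135": {4, 14},               # hypothetical mapping
        "Computers and Society": {11},   # hypothetical mapping
    }

    # Program outcomes that must be covered (our program defines 1-19).
    required_outcomes = set(range(1, 20))

    covered = set().union(*planned_coverage.values())
    uncovered = sorted(required_outcomes - covered)
    if uncovered:
        print("Uncovered program outcomes:", uncovered)
    else:
        print("All program outcomes are covered.")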
2. Results
2.1. Discussion
Last year's report identified the following issues with educational results:
1. Two outcomes have much lower SPAD scores than all the others. Outcome #11 (ABET
G) "Analyze the local and global impact of computing on individuals, organizations and
society" is at only 55.6% whereas outcome #14 "Describe network environments
including the protocol stacks underlying internet and Web applications, and develop and
run distributed programs using sockets and RPC" is at only 50.8%.
2. For several outcomes, there is substantial disagreement between the SPAD scores and the
scores from the senior exit survey. In 6 cases, the student score is less than the SPAD
score by double digits.
3. The light coverage (only one instrument across the entire curriculum) of outcomes #9
(ABET H) and #11 (ABET G).
Accordingly, in last year's report we wrote the following plan to address each of the above
issues:
1. Work with instructors of the "Computers and Society" course in the Institute's humanities
college. This course is used to establish outcome #11, which in 2007-08 had a measured
success rate of only 55.6%. The curriculum committee will explore whether the problem
lies with the material, the assessment instrument, the instruction, or something else.
2. Work with the instructor of CS 135 regarding outcome #14 (only 50.8% success).
3. Monitor the discrepancies between scores from direct measurement and from student
opinion in the senior exit survey. While there are several cases where students evaluated
the program lower than direct measurement indicated, there are also some cases where
the student score was substantially higher. We do not know whether these differences
are meaningful or largely random.
4. We saw no simple and logical way to alter the curriculum to increase coverage of lightly
covered outcomes such as outcome #9 (ABET H) and outcome #11 (ABET G).
Therefore, we will monitor this point in future years.
The summary SPAD data (from http://www.stevens.edu/compsci/accreditation/200809/Program_Year0809_Summary.xls) is shown below:
PROGRAM   NUMBER     NUMBER         NUMBER      NUMBER HIGH   PERCENT ACCEPTABLE
OUTCOME   ASSESSED   UNACCEPTABLE   ACCEPTABLE  PERFORMANCE   OR HIGH
1         104        7              52          45            93.3
2         301        46             153         102           84.7
3         817        132            293         367           80.8
4         355        117            91          141           65.4
5         123        13             92          18            89.4
6         217        39             104         74            82.0
7         56         0              18          38            100.0
8         199        50             68          91            79.9
9         18         1              4           13            94.4
10        28         0              5           23            100.0
14        54         15             29          10            72.2
16        176        30             88          58            83.0
18        392        114            112         166           70.9
TOTAL     2840       564            1109        1146          79.4
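The percent column is simply (acceptable + high performance) / assessed. As a sanity check, a short Python sketch with a few rows copied from the table above reproduces it:

    # Selected rows from the SPAD summary table:
    # outcome -> (assessed, unacceptable, acceptable, high_performance)
    spad = {
        1:  (104, 7, 52, 45),
        4:  (355, 117, 91, 141),
        14: (54, 15, 29, 10),
    }

    for outcome, (assessed, _unacceptable, acceptable, high) in spad.items():
        pct = 100.0 * (acceptable + high) / assessed
        print(f"outcome {outcome}: {pct:.1f}% acceptable or high")
    # outcome 1: 93.3%, outcome 4: 65.4%, outcome 14: 72.2%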
This year there are no glaringly low numbers as there were last year, although outcome #4
(ABET A) stands out as the only score below 70%. With that exception, all outcomes
that map to ABET A-K score roughly 80% or higher, and most are higher than last year.
Therefore, we consider 2008-09 a qualified success even though the overall number is down
from 81.4% to 79.4%. This year we performed almost 40% more assessments (2840 versus 2042),
which may make it harder to achieve high numbers. The table below compares 2007-08 performance
to 2008-09 performance for ABET outcomes only:
OUTCOME   2007-08 SUCCESS RATE   2008-09 SUCCESS RATE   DIFFERENCE
A (4)     86.7%                  65.4%                  -21.3%
B (1)     91.6%                  93.3%                  +1.7%
C (16)    89.0%                  83.0%                  -6.0%
D (7)     75.8%                  100%                   +24.2%
E (10)    70.3%                  100%                   +29.7%
F (8)     78.8%                  79.9%                  +1.1%
G (11)    55.6%                  Not assessed           N/A
H (9)     90.0%                  94.4%                  +4.4%
I (5)     79.6%                  89.4%                  +9.8%
J (2)     79.1%                  84.7%                  +5.6%
K (3)     82.7%                  80.8%                  -2.1%
Obviously, outcome #4 (ABET A) stands out as a concern. This outcome is assessed in four
courses, so there is no simple explanation for the big drop.
Last year there was a noticeable difference between outcome success as measured by SPAD data
and as measured in the senior exit survey. This year the difference remains and has grown. The
table below shows that graduating seniors have a lower opinion of outcome achievement for
11 of the 13 outcomes assessed. Last year we were unsure whether this phenomenon was an
anomaly; this year we are sure it is not.
PROGRAM OUTCOME   AVERAGE STUDENT   STUDENT NUMBER MINUS
                  RESPONSE          SPAD NUMBER
1 (ABET B)        80                -13.3
2 (ABET J)        83                -1.7
3 (ABET K)        83                +2.2
4 (ABET A)        58                -7.3
5 (ABET I)        73                -16.4
6                 71                -11.0
7 (ABET D)        83                -17.0
8 (ABET F)        81                +1.1
9 (ABET H)        87                -7.4
10 (ABET E)       71                -29.0
11 (ABET G)       76                N/A
12                80                N/A
13                73                N/A
14                65                -7.2
15                68                N/A
16 (ABET C)       75                -7.9
17                76                N/A
18                67                -3.9
19                79                N/A
OVERALL AVERAGE   75.2
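The rightmost column is the average student response minus the SPAD success rate. A short sketch with values copied from the two tables above reproduces the column and the 11-of-13 count (tiny mismatches with the table, e.g. -7.4 versus -7.3 for outcome #4, are rounding artifacts):

    # Average student responses and rounded SPAD success rates for the 13
    # SPAD-assessed outcomes, copied from the tables above.
    student = {1: 80, 2: 83, 3: 83, 4: 58, 5: 73, 6: 71, 7: 83,
               8: 81, 9: 87, 10: 71, 14: 65, 16: 75, 18: 67}
    spad_pct = {1: 93.3, 2: 84.7, 3: 80.8, 4: 65.4, 5: 89.4, 6: 82.0,
                7: 100.0, 8: 79.9, 9: 94.4, 10: 100.0, 14: 72.2,
                16: 83.0, 18: 70.9}

    diffs = {o: student[o] - spad_pct[o] for o in student}
    lower = sorted(o for o, d in diffs.items() if d < 0)
    print(f"Students rated {len(lower)} of {len(diffs)} outcomes lower: {lower}")
    # -> Students rated 11 of 13 outcomes lower: [1, 2, 4, 5, 6, 7, 9, 10, 14, 16, 18]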
2.2. Planned Improvements in Curriculum and Teaching
The first priority for next year is to ensure that all ABET-required outcomes are assessed. To do
this, the program director will speak with instructors prior to their courses to ensure that enough
course outcomes will be assessed to cover all the program outcomes. This is really a process
improvement, and it is mentioned above in section 1.2. But since it bears on what happens in the
classroom, we mention it here also. We discussed amending the PCR form to require instructors
to state which outcomes they will assess, but decided against it. The assessment process is still
new and challenging for many faculty, so we prefer not to impose additional changes, even small ones.
To home in on why there is such a divergence between the opinions of graduating seniors and
SPAD data, we will gather per-course student opinions. SPAD data is per-outcome and per-
course, whereas the senior exit survey asks seniors to look back and give an opinion about the
program as a whole; the two measures are therefore not directly comparable. For 2009-10 we
will augment the online "Dean Russ" surveys administered via ACE to ask students in each
course to rate, for each course outcome, the extent to which they feel the course established the
outcome. This will place a significant burden on the students, since many courses have more
than a dozen outcomes, but having per-outcome opinions from students who have just completed
the course should allow us to zero in on the areas where students feel we are failing them.
Outcome #4 (math-stat, ABET A) is a problem. The SPAD evaluation is only 65% and the
student evaluation is even lower, 58%. One of the key courses for establishing this outcome
is CS 135, Discrete Structures. This course is commonly taken in the spring of the freshman
year and has been the subject of student complaints; in particular, students complain that
learning Scheme is a challenge. Accordingly, for 2009-10 we decided to add a lab section to
this course. The two-hour lab will be a time when students can work on their programming
assignments with an instructor present.