Online Senior Assessment 2009: General Report

Table of Contents
EXECUTIVE SUMMARY
INTRODUCTION
RESPONDENTS
RESULTS
COMPARISON: 2008 vs 2009
LIMITATIONS
APPENDIX

Office of Planning and Assessment, September 2009

EXECUTIVE SUMMARY
In 2009, the final sample size is 704. That means about 10% of the total population of
seniors at Texas Tech are included in this analysis. This sample contains many more
female students than would be expected. On the other hand, the sample seems to
represent the different ethnicities in the senior population very well. There are slightly
more Arts & Science students than would be expected, but overall the sample also
represents the population fairly well in regards to which college the students come from.
However, the differences between sample and population are statistically significant
at the .05 level for GPA and age; the sample has a slightly higher average GPA and average age.
The differences are not statistically significant for total transfer credits and total credits
earned.
There were a total of 32 so-called knowledge questions from Humanities, Multicultural,
Mathematics, Natural Sciences, and Social and Behavioral Sciences on the Online
Senior Assessment 2009 (i.e., questions that can be answered either correctly or
incorrectly). For 26 of those questions, students who took their course for core
requirement at Texas Tech performed better than those who took the course
somewhere else; the latter group performed better on only 6 questions. The difference
was statistically significant at the .05 level in only 8 cases, all of them cases
where TTU students outperformed the transfer students.
When averaging the results of the knowledge questions by area first and then
calculating the overall average, the average score is 62.77%. The lowest score is
14.86% and the highest score is 98%. The same results by area can be split up
between students who took the respective courses for core requirement at TTU and
those who took it elsewhere (“TTU” vs “Else”). This should provide some information as
to whether students who took the core at Texas Tech perform better or worse on average than
those who took it elsewhere. Running a 2-sample t-test (a statistical test often used to
check for differences in means of two different groups) shows that the means of those
average scores are different from each other (statistically significant at the .05 level).
The mean scores show that participants who took their core at TTU do a little bit better
on average (63.19% for “TTU” vs 60.79% for “Else”). A paired t-test was run for only
those students who have taken some classes for core requirements at Tech and some
elsewhere (382 students). This allows us to compare whether the same student performs
any differently depending on where the course was taken (at TTU or elsewhere). Again, the
average is slightly higher for the “TTU” group, but this difference is not statistically
significant at the .05 level.
The average per area was compared between “TTU” and “Else” per core area (including
Humanities, Multicultural, Mathematics, Natural Sciences, Social and Behavioral
Sciences, and VPA). Except for Humanities, the average score is higher when the core
requirement was taken at TTU. Where “TTU” scored higher, the difference is statistically
significant in every area except Mathematics; the slightly higher “Else” score in
Humanities is not significant either. This means that “TTU” performs significantly better
than “Else” in Multicultural, Natural Sciences, Social and Behavioral Sciences, and VPA
(at the .05 level).
Natural Sciences and Mathematics have the lowest average scores while Multicultural
and Humanities have some of the highest average scores. In almost all cases the
variance of performance is higher for the “Else” group. This means that the spread is
wider between bad and good scores for students who took the course for core
requirement elsewhere in almost all areas.
The correlations between overall score of participants and both GPA and total credits
earned are significant at the .05 level. Students who did well in their classes also did
well on this assessment. This connection is by far the strongest. The correlation
between total credits earned and overall percentage of correct answers is positive as
well, which means that students who have taken more classes do better on this
assessment than students with fewer courses under their belt. Interestingly, the
correlation between performance and transferred credits is negative, which would mean
that the more credits a student transfers in, the worse their performance on the OSA.
However, this correlation is not quite significant.
The results of the Online Senior Assessment 2009 are similar to the administration of
the Online Senior Assessment in 2008. However, the response rate was considerably
lower in 2009 (about 10% versus about 20%). This could be due to the longer survey in
2009 and/or the fact that Texas Tech was on academic probation in 2008, which likely
motivated participation that year. When comparing the averages of
each area from 2008 and 2009, it looks like overall the results are very similar. The
biggest change is that the VPA averages seem to have increased for both “TTU” and
“Else”. However, it is important to note that the questions in this area changed, so
they may not capture the same knowledge as they did last year. Another reason
might be that the scoring rubric changed. However, it is also possible that the students
performed better. Overall, students still seem to have the lowest average scores in
Mathematics and the highest scores in the Natural Sciences and the Humanities.
Some limitations include (1) the participants have no incentive to do well, (2) the
participants might not know where they took their courses for core requirement, (3)
there are few questions per core area, and (4) there is no information about where
students took their courses if they did not take them at Texas Tech.
INTRODUCTION
The Online Senior Assessment (OSA) was designed in 2008 to assess core-curriculum
knowledge and abilities. In 2009, it was administered for the second time to all 7110
graduating seniors (i.e., students having 90 or more credit hours) between 3/4/09 and
4/10/09. A total of 758 students participated leading to an initial response rate of
10.66%. Three of the participants were randomly selected to win $1,500 toward tuition
and fees.
The instrument has one section for each of the following areas:
‐ Humanities (3 self-assessment questions, 4 knowledge questions)
‐ Multicultural (7 questions)
‐ Mathematics (5 questions)
‐ Natural Sciences (6 questions)
‐ Technology and Applied Sciences (4 questions)
‐ Social and Behavioral Sciences (10 questions)
‐ Visual and Performing Arts (4 questions)
Please see attachment A for the full instrument.
RESPONDENTS
Inquisite, the survey software used, provides information about when students first
started the survey and when they finally submitted it. The following graph shows how
much time the participants spent on the survey graphically (hours:minutes:seconds)
while the tables below provide some more details (values in seconds).
The minimum time spent on the survey was 2 minutes and 57 seconds and the
maximum time was 10 days 5 hours 58 minutes and 41 seconds. The average time
participants spent on the survey is 2 hours and 57 minutes. However, as the chart
above shows, this number is elevated because of the large outliers to the right. These
outliers in all likelihood were students who started the survey and did not get back to it
until a few days later. Restricting the range to 3 hours (in the chart below) shows that
most values actually cluster around 20 to 30 minutes. The outliers were not actually
deleted from the analysis; they are just not shown in the chart below, which allows it to
show more detail in the range where most students fall.
There were 54 participants who took less than 15 minutes for the whole instrument.
They were taken out before any further analysis was undertaken since it takes about as
long just to read through the instrument. Inquisite forces students to choose an answer
before moving on to the next section, so in all likelihood, these students just clicked on
any answer to get to the end where they could submit their name for the chance to win
$1,500 toward tuition and fees. Eliminating these respondents leads to a final sample
size of 704 students (i.e. 9.9%).
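The cleaning step just described can be sketched in a few lines of Python. The duration values below are hypothetical stand-ins; the real values would come from the Inquisite start and submit timestamps.

```python
# Drop respondents who finished in under 15 minutes, as described above.
# `durations` holds hypothetical completion times in seconds, one per
# respondent; the real values would come from the Inquisite export.
durations = [145, 1250, 1820, 430, 2400, 880, 905, 36000]

MIN_SECONDS = 15 * 60  # 15-minute threshold
kept = [d for d in durations if d >= MIN_SECONDS]
removed = len(durations) - len(kept)
```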
Given that the final response rate is only about 10%, it is questionable whether the
sample is representative of the population. A look at the following variables should determine the
answer to this question: gender, ethnicity, college, GPA, transfer credits, total credits
earned, and age.
The charts below compare the number of female and male students of the sample to
the actual population of all seniors with at least 90 credit hours (F = female, M = male,
N = no answer). As can be seen, in the population the split between male and female
students is almost exactly reversed. Therefore, the sample has disproportionately many
females.
The charts below compare the distribution of ethnicity of the sample and the actual
population. The sample seems to mirror the population which means that in this regard
our sample seems to be representative of the actual population.
The charts below compare which college the participants come from to the actual
population. The charts look very similar which points to a good representativeness of
the sample. The only difference that seems fairly large is between students from Arts
and Sciences. There seem to be disproportionately many in the sample.
The chart below shows the average (mean), standard deviation (std dev), number of
observations (N), minimum, and maximum for the sample for the following variables:
(1) cumulative GPA,
(2) total transfer credits,
(3) total credits earned, and
(4) age.
The chart below provides the same information for the population.
Sample and population seem to be very similar along those four characteristics. The
average GPA is a little higher in the sample and the participants of the sample have a
lower average of transfer credits. The participants have a slightly higher average
number of total credits earned and are a little bit older.
The charts below compare the distribution of GPA of the sample and the actual
population. The main difference is that the largest group of students in the sample is the
one with the highest GPA (around 3.9) while the largest group in the population has a
GPA around 3.075.
The charts below compare the distribution of total transfer credits of the sample and the
actual population. They look almost identical.
The charts below compare the distribution of total credits earned of the sample and the
actual population. It looks like the sample has slightly more people that have already
completed more credit hours than the population.
The charts below compare the distribution of age of the sample and the actual
population. It shows that the distributions are very similar.
A one-sample z-test (a statistical test often used to compare the sample to the
population when the actual population is known) was performed to see if these
differences are significant (see attachment B for details). The differences between
sample and population are statistically significant at the .05 level for GPA and age. The
differences are not statistically significant for total transfer credits and total credits
earned. The sample size is very large here and so even small differences can be
statistically significant. However, practically speaking, even the statistically significant
differences look very small.
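A one-sample z-test of this kind can be sketched with the Python standard library alone. The GPA figures below are hypothetical stand-ins for illustration; the actual sample and population statistics are in Attachment B.

```python
from statistics import NormalDist

def one_sample_z(sample_mean, pop_mean, pop_sd, n):
    """One-sample z-test: compares a sample mean against a known
    population mean, using the known population standard deviation.
    Returns the z statistic and the two-sided p-value."""
    z = (sample_mean - pop_mean) / (pop_sd / n ** 0.5)
    p = 2 * NormalDist().cdf(-abs(z))
    return z, p

# Hypothetical illustration: sample mean GPA 3.20 vs. population mean
# 3.10 with population sd 0.50, at the report's sample size of 704
z, p = one_sample_z(3.20, 3.10, 0.50, 704)
# With n this large, even a 0.10 GPA difference is highly significant
```

This illustrates the point made above: at a sample size of 704, even small differences can reach statistical significance.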
Overall, the sample seems to represent the population fairly well along the chosen
criteria. The biggest differences are between gender and GPA. Some measures should
be put into place to encourage more male students to participate in the assessment
next time. Also, students who do not have top-10% GPAs should be encouraged to
participate. Perhaps the invitation could be phrased in a way that makes it less
intimidating to students who do not perform at the top of their class.
RESULTS
One of the main questions the Online Senior Assessment can help answer is whether
students who take their classes for core requirement at institutions other than Texas
Tech (i.e., the transfer students) perform similarly to students who took their classes
for core requirement at Texas Tech (i.e., the resident students). Whenever those two groups are
compared in the analysis in this report, the group “Else” contains the students who have
transferred in credits for core requirements and the group “TTU” stands for all the
students who took their core requirement courses at Texas Tech.
The data was summarized to show the number of people who had correct and incorrect
answers to each of the 32 knowledge questions from the different areas. A Chi-squared
test for differences in probabilities (a statistical test used to test for differences when the
variables are categorical) was run on each question to test whether the difference in
performance between “TTU” and “Else” is statistically significant at the .05 level, i.e., whether
there is a relationship between where students took the course for core requirement and
the number of correct answers to the questions (see attachment C for details). The
chart below shows the difference between the percentage of the “TTU” group that had
the correct answer and the percentage of the “Else” group that had the correct answer.
Positive numbers (i.e., the bars to the right) represent all questions where “TTU” had a
higher proportion of correct answers (for 26 questions) and negative numbers (i.e., the
bars to the left) represent where “Else” had the higher proportion of correct answers
(only 6 questions). The bars in red show where the difference was statistically
significant, i.e. where a relationship exists between where the student took the course
and the number of correct answers for that specific question. It is interesting that quite a
few instances where “TTU” performed better are significant while the few instances
where “Else” outperformed “TTU” are all not significant. The instances where the
difference was significant could point to questions that are better at distinguishing
between the two populations than the other questions.
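As a sketch of the test used here, a 2×2 chi-square (without continuity correction) can be computed with the Python standard library alone. The counts are the Humanities 1 row from Attachment C; for that row the computed p-value lands near the report's 0.8094.

```python
from math import erfc, sqrt

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]] and its p-value; with 1 degree of
    freedom, p = erfc(sqrt(chi2 / 2))."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return chi2, erfc(sqrt(chi2 / 2))

# Humanities 1 counts from Attachment C: "Else" answered 116 correct /
# 62 incorrect, "TTU" answered 348 correct / 178 incorrect
chi2, p = chi2_2x2(116, 62, 348, 178)
# chi2 is far below the 3.84 critical value, so the difference on this
# question is not significant at the .05 level
```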
[Bar chart: difference between the percentage of correct answers for “TTU” and “Else” on each of the 32 knowledge questions (Humanities 1–4, Multicultural 1–7, Mathematics 1–5, Natural Sciences 1–6, and Social and Behavioral Sciences 1–10); horizontal axis from -10% to 15%.]
The tables below summarize how students performed overall in the different areas (i.e.,
the percentage of correct answers). Only the 32 questions from Humanities,
Multicultural, Mathematics, Natural Sciences, and Social and Behavioral Sciences
where one correct answer exists (i.e., the knowledge questions) were included in this
analysis. The self-assessment questions from Humanities & Technology and Applied
Sciences were excluded along with the VPA questions since neither of those can be
either right or wrong. There were a total of 704 participants and the average score is
61.73% which is disappointing because it would not even be a passing score. The
lowest score is 18.75% and the highest score is 96.88%.
The graph below displays that same information. It shows that most students have
average scores between 60% and 70%. It looks like more students fall to the left of that
range, which means their scores were actually lower than 60%.
Since each area has a different number of questions, the results above are influenced
more by those areas with more questions (e.g., Social and Behavioral Sciences and
Multicultural) and less by those with fewer questions (e.g., Humanities and
Mathematics). The tables below provide information on how the students performed
when the information is averaged by area first. The average score is slightly higher
(62.77%), the lowest score is lower (14.86%), and the highest score is higher (98%).
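The difference between the two averaging schemes (pooling all questions vs. averaging by area first) can be illustrated with a small sketch; the per-area counts below are hypothetical results for a single student, not the report's data.

```python
# Hypothetical per-student results: (correct answers, questions) per area
areas = {
    "Humanities": (3, 4),
    "Multicultural": (5, 7),
    "Mathematics": (2, 5),
    "Natural Sciences": (4, 6),
    "Social and Behavioral Sciences": (6, 10),
}

# Pooled: every question weighs the same, so larger areas dominate
pooled = sum(c for c, n in areas.values()) / sum(n for c, n in areas.values())

# By area first: every area weighs the same regardless of its size
by_area = sum(c / n for c, n in areas.values()) / len(areas)
```

The two numbers differ slightly whenever performance varies across areas of different sizes, which is exactly why the report's overall average shifts from 61.73% to 62.77% under the second scheme.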
The results are almost the same when they are averaged by area first. The graph below
shows that the whole distribution seems very similar even though it is a little wider (both
more lower scores as well as some higher scores). It looks like a few more people have
higher scores when the results are displayed this way.
The same results by area can be split up between people who took the respective
courses for core requirement at TTU and those who took it elsewhere (“TTU” vs “Else”).
This should provide some information as to whether students who took the core at Texas Tech
perform better or worse on average than those who took it elsewhere. Since it is
possible for one and the same student to take one course for one area at Texas Tech
(e.g., Humanities) and another course for another area somewhere else (e.g.,
Mathematics), there can be a percentage in both groups for the same student. The
group “TTU” therefore averages the scores of those areas where the core requirement
was taken at Texas Tech while the group ”Else” averages scores of those areas where
the core requirement was taken elsewhere. The two charts below compare the
percentage of correct answers for “TTU” (left) and “Else” (right). While the scores for
“TTU” clearly center around 60% – 70%, the results for “Else” are much less focused.
There seem to be many lower scores as well as some very high ones. The middle is not
nearly as well defined as in the “TTU” group.
“TTU” “Else” Running a 2-sample t-test (a statistical test often used to check for differences in means
of two different groups) shows that the means of those percentages are different from
each other (statistically significant at the .05 level). The mean scores show that
participants who took their core at TTU do a little bit better on average (63.19% for
“TTU” vs 60.79% for “Else”). See below for the details.
Again, due to the sample size even small differences can be statistically significant. The
graph below shows how similar the results are overall though. The lowest point depicts
the lowest average score, the second lowest point (where the blue starts) is the lower
quartile, the third point (in the middle of the blue box) shows the mean, the fourth point
(where the blue ends) shows the upper quartile, and the last point is the maximum
score.
Even though the correlation between the two groups is very low (.16; see attachment
D), there is an obvious violation of independence (participants can have scores in both
groups). Since independence is such an important assumption when using a 2-sample
t-test, a paired t-test was run for only those students who have taken some classes for
core requirements at Tech and some elsewhere (382 students). This allows us to
compare whether the same student performs any differently depending on where the course was
taken (at TTU or elsewhere). Again, the average is slightly higher for the “TTU” group,
but the results below show that this difference is not statistically significant at the .05
level.
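The paired test described above can be sketched with the Python standard library. The per-student scores below are hypothetical, and a normal approximation to the t distribution is used for the p-value, which is reasonable at the report's 382 pairs; scipy.stats.ttest_rel would give the exact value.

```python
from statistics import NormalDist, fmean, stdev

def paired_t(x, y):
    """Paired t-test on the within-student differences x[i] - y[i].
    Returns the t statistic and an approximate two-sided p-value
    (normal approximation, close to the exact t for large samples)."""
    diffs = [a - b for a, b in zip(x, y)]
    t = fmean(diffs) / (stdev(diffs) / len(diffs) ** 0.5)
    p = 2 * NormalDist().cdf(-abs(t))
    return t, p

# Hypothetical fractions of correct answers for the same students,
# split by where the core course was taken
ttu_scores  = [0.65, 0.70, 0.55, 0.80, 0.60, 0.75, 0.50, 0.68]
else_scores = [0.60, 0.72, 0.50, 0.78, 0.58, 0.70, 0.52, 0.66]
t, p = paired_t(ttu_scores, else_scores)
```

The pairing matters: because each difference is computed within one student, between-student variation cancels out, which is why the paired test is the appropriate one when the same participants appear in both groups.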
The table below compares the differences between “TTU” and “Else” per core area.
Except for Humanities, the average score is higher when the core requirement was
taken at TTU. Where “TTU” scored higher, the difference is statistically significant in
every area except Mathematics; the slightly higher “Else” score in Humanities is not
significant either. This means that “TTU” performs significantly better than “Else” in
Multicultural, Natural Sciences, Social and Behavioral Sciences, and VPA (at the .05 level). This could
point to the possibility that taking a course for core requirement in these areas at TTU
(versus taking it elsewhere) helps the students to perform better. However, this
conclusion assumes that the questions in these areas accurately capture the knowledge
that students are supposed to have in these areas.
                                  Core Elsewhere               Core at TTU
Core Area                         N     mean       std dev     N     mean       std dev     p-value
Humanities                        178   0.73736    0.2714      526   0.731464   0.2512      0.791
Multicultural                     148   0.72008    0.1899      556   0.758736   0.1775      0.0207
Mathematics                       134   0.40149    0.22        570   0.441754   0.2479      0.0846
Natural Sciences                  201   0.6675     0.2258      503   0.717362   0.1999      0.0041
Social and Behavioral Sciences    208   0.48894    0.1706      496   0.53004    0.1544      0.0019
VPA                               64    0.60208    0.1743      135   0.658683   0.162       0.0258
The following chart shows the mean with a 95% confidence interval for each of the
areas above as well as an added area, Visual and Performing Arts (VPA). The mean is
shown as the line in the little box and the confidence interval is depicted as the little blue
box. For all areas except VPA, the percentage of correct answers is shown while for
VPA the percentage grade averaged for all three questions is shown. For each area the
data is shown for “Else” and “TTU”. Natural Sciences and Math have the lowest average
scores while Multicultural and Humanities have some of the highest average scores. In
almost all cases the variance of performance is higher for the “Else” group (as indicated
by a wider confidence interval). This means that the spread is wider between bad and
good scores for students who took the course for core requirement elsewhere in almost
all areas. The “TTU” group performs much more consistently, which could point to the
education at TTU being of similar standard regardless of which course was taken for
the core. Again, this assumes the questions
accurately capture knowledge students are supposed to achieve in the various core
areas.
[Chart: mean percentage of correct answers with 95% confidence interval for “Else” and “TTU” in each area (VPA, Social and Behavioral Sciences, Natural Sciences, Mathematics, Multicultural, and Humanities); axis from 0.00% to 100.00%.]
The table below shows the correlation between the overall score of participants with
some of the variables used to determine whether the sample was representative. The
correlations with both GPA and total credits earned are significant. Students who did
well in their classes also did well on this assessment. This connection is by far the
strongest of all four variables. The correlation
between total credits earned and overall percentage of correct answers is positive as
well which means that students who have taken more classes do better on this
assessment than students with fewer courses under their belt. Interestingly, the
correlation between performance and transferred credits is negative, which would mean
that the more credits a student transfers in, the worse their performance on the OSA.
However, this correlation is not quite significant.
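The correlation-plus-significance check can be sketched as follows (Python standard library only). The GPA/score pairs are hypothetical, and a normal approximation is used for the p-value, which is very close at n = 704.

```python
from math import sqrt
from statistics import NormalDist, fmean

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = fmean(x), fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x) *
                      sum((b - my) ** 2 for b in y))

# Hypothetical GPA / overall-score pairs for illustration only
gpa   = [2.8, 3.0, 3.2, 3.4, 3.6, 3.8, 4.0]
score = [0.52, 0.55, 0.61, 0.60, 0.68, 0.70, 0.75]
r = pearson_r(gpa, score)

# Significance of r at the report's sample size: t = r*sqrt((n-2)/(1-r^2)),
# approximated here with a normal distribution since n = 704 is large
n = 704
t = r * sqrt((n - 2) / (1 - r ** 2))
p = 2 * NormalDist().cdf(-abs(t))
```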
COMPARISON: 2008 vs 2009
The response rate in 2008 was about 20% which is much higher than 2009’s 10%. The
incentives were the same and the survey was open almost 2 weeks longer in 2009 than
in 2008. In 2008, students were invited to participate at the end of February, while in
2009 invitations went out 5 days later (at the beginning of March). This gap is too small
to explain such a drop in the response rate. Spring Break was March 16-20 in 2009
while in 2008 it was March 17-21, so the difference there is minimal again. One of the
obvious differences is the length of the survey. Since students answered questions
from only two subject areas in 2008 as opposed to questions from all areas in 2009,
the instrument was much longer in 2009. The estimated time to take it might have
scared off some students who would otherwise have participated. Another reason could
be that when this assessment was offered in 2008, students were very aware that
Texas Tech was on academic probation by SACS-COC (Southern Association of
Colleges and Schools – Commission on Colleges). They more than likely felt more
pressure to help Texas Tech with its assessment efforts.
When comparing the number of times “TTU” outperformed “Else” on the different
questions in 2008 and 2009, it is interesting to note that while in both years “Else”
performed better than “TTU” on exactly 6 questions, these questions were not actually
the same. In fact, some are even from different areas. However, this is not surprising
since none of these differences were significant in 2009.
When comparing the averages of each area from 2008 and 2009, it looks like overall
the results are similar. The biggest change is that the VPA averages seem to have
increased for both “TTU” and “Else”. However, it is important to note that the questions
in this area changed, so they may not capture the same knowledge as they did
last year. Another reason might be that the scoring rubric changed. However, it is also
possible that the students performed better. Overall, students still seem to have the
lowest average scores in Mathematics and the highest scores in the Natural Sciences
and the Humanities.
LIMITATIONS
There seem to be two major limitations with the way the Online Senior Assessment is
administered. First of all, students really have no incentive to do well. And as the
analysis of how much time students spend on the assessment shows, many students
seem to click through rather fast and do not seem to take much time to read through the
questions and think about the answers. It would be very helpful if in the future this could
be changed somehow. For example, the incentive structure could be changed: while
now every participant is entered into a drawing for the tuition waiver, it could be
restructured so that only students who pass a certain threshold (e.g., a “passing
grade” of 70%) are entered into the drawing. Until this is changed, there is no way of
knowing if students did better because they know more or because they cared more
about giving the right answer.
The second major limitation seems to be the assumption that students actually know
where they took the classes for the different areas of the core curriculum. While this is
easy for students who transferred to TTU after meeting all their core requirements or for
students who took all their classes at TTU, a large group of the students take some
classes at TTU and some classes at other institutions. It is questionable whether they
know which class counted for core curriculum requirements and which class was just
part of their regular curriculum. Expecting students to self-identify correctly into the
groups “TTU” or “Elsewhere” is a very important assumption since all the results are
based on correct selection. If students are not able to select the correct group, all
conclusions drawn concerning which group does better are not valid. One potential improvement
could be to ask a series of guiding questions as opposed to only one question. For example, the
students could be asked “Have you taken any classes at any other institution than
Texas Tech?”. If the answer is yes, the next question could be “Which area was this
class in?”. The answer choices could be for example biology, physics, theater, etc. The
next question could be “Did any of these classes count toward your core requirement?”.
Most importantly however, the answer options to this question should include an “I don’t
know”. That way students at least have the option of saying that they cannot provide us
with this information.
Another potential limitation is that there are very few questions per core area and those
do not necessarily represent each subject a student could have taken as part of the
core curriculum. For example, the questions in Natural Sciences seem very focused on
biology. A student could have simply taken other classes that are part of Natural
Sciences and performed badly on this assessment. Also, the group “Else” is far from
homogeneous. Students in this group could have taken their classes anywhere from a
small community college to another major institution. It might be worthwhile to explore
the differences in this group further.
APPENDIX
Attachment A: The Instrument
Attachment B: Sample Statistics and Z-Tests
Attachment C: Knowledge Questions
“Else” (n = 178) vs “TTU” (n = 526)

Question          Else correct   Else incorrect   TTU correct   TTU incorrect   Chi-Square Probability
Humanities 1      116            62               348           178             0.8094
Humanities 2      94             84               242           284             0.1163
Humanities 3      157            21               471           55              0.6181
Humanities 4      158            20               478           48              0.41

“Else” (n = 148) vs “TTU” (n = 556)

Question          Else correct   Else incorrect   TTU correct   TTU incorrect   Chi-Square Probability
Multicultural 1   100            48               393           163             0.4621
Multicultural 2   129            19               528           28              0.0007
Multicultural 3   140            8                520           36              0.6329
Multicultural 4   71             77               255           301             0.6474
Multicultural 5   100            48               424           132             0.0312
Multicultural 6   100            48               421           135             0.0445
Multicultural 7   106            42               412           144             0.5433

“Else” (n = 134) vs “TTU” (n = 570)

Question          Else correct   Else incorrect   TTU correct   TTU incorrect   Chi-Square Probability
Mathematics 1     46             88               218           352             0.3993
Mathematics 2     95             39               423           147             0.4335
Mathematics 3     62             72               253           317             0.6933
Mathematics 4     35             99               180           390             0.2169
Mathematics 5     31             103              185           385             0.0353

“Else” (n = 201) vs “TTU” (n = 503)

Question               Else correct   Else incorrect   TTU correct   TTU incorrect   Chi-Square Probability
Natural Sciences 1     88             113              226           277             0.7817
Natural Sciences 2     86             115              255           248             0.0579
Natural Sciences 3     134            67               357           146             0.2611
Natural Sciences 4     170            31               444           59              0.185
Natural Sciences 5     156            45               434           69              0.0048
Natural Sciences 6     171            30               449           54              0.1214

“Else” (n = 208) vs “TTU” (n = 496)

Question                            Else correct   Else incorrect   TTU correct   TTU incorrect   Chi-Square Probability
Social and Behavioral Sciences 1    115            93               305           191             0.1258
Social and Behavioral Sciences 2    136            72               395           101             <.0001
Social and Behavioral Sciences 3    71             137              213           283             0.0297
Social and Behavioral Sciences 4    21             187              41            455             0.4344
Social and Behavioral Sciences 5    80             128              201           295             0.6101
Social and Behavioral Sciences 6    112            96               272           224             0.8093
Social and Behavioral Sciences 7    62             146              139           357             0.6326
Social and Behavioral Sciences 8    103            105              269           227             0.2529
Social and Behavioral Sciences 9    199            9                489           7               0.0179
Social and Behavioral Sciences 10   118            90               305           191             0.2392
Attachment D: Correlation “TTU” and “Else”