Collegiate Assessment of Academic Proficiency: Science Test
Fall 2010 Administration
Office of Planning and Assessment, Devin DuPree and Sabrina Sattler, December 2010
Table of Contents
Executive Summary
Introduction
    Sample
Results
    CAAP Scores Overall
    CAAP Scores by Sex, Major, GPA, and Motivation
    Additional Questions Scores Overall
    Additional Questions Scores by Sex, Major, GPA, and Motivation
    Relationship between CAAP Scores and Additional Questions Scores
Conclusion
Appendix
    Appendix A: Information about the CAAP Science Test
    Appendix B: Copy of the Additional Questions
    Appendix C: Analysis of CAAP Scores by Sex, Major, GPA, and Motivation
    Appendix D: Analysis of Additional Questions Scores by Sex, Major, GPA, and Motivation
    Appendix E: CAAP Scores and Additional Questions Scores by Student Classification
    Appendix F: CAAP Scores and Additional Questions Scores by Sex, Major, and Minor
Executive Summary
The Collegiate Assessment of Academic Proficiency (CAAP) Science Test is designed to measure
skills in scientific reasoning. In coordination with the various Texas Tech University (TTU)
Colleges, the Office of Planning and Assessment administered the Science Test in select courses
across campus at the beginning of the fall 2010 semester. Courses were chosen based on their
enrollment of juniors and seniors to achieve a representative number from each College. ACT
allows the administering institution to add nine additional questions to the chosen CAAP
module. TTU chose to add the six Natural Sciences questions from the Online Senior
Assessment (OSA) plus three newly developed Natural Sciences questions to allow for a
comparison that could help validate the Natural Sciences questions from the OSA.
A total of 495 TTU students participated in the fall 2010 administration
of the CAAP Science Test. The answer sheets of 400 participants were sent to ACT for scoring. A final
sample of 331 participants (161 juniors and 170 seniors) is included in this analysis. The final
sample seems to be a fairly good representation of the population of all TTU juniors and seniors
in terms of sex, ethnicity, and college.
Looking at the CAAP scores overall it appears that the TTU juniors (mean = 61.4, SD = 4.1) score
similarly on average to the national sample of juniors (mean = 61.4, SD = 4.7) and that the TTU
seniors (mean = 61.3, SD = 4.1) score higher on average than the national sample of seniors
(mean = 60.7, SD = 4.8) on the Science Test. There were statistically significant differences in
CAAP scores by sex, major, and motivation. Male participants score higher on average than female
participants. Natural Science majors score higher on average than other majors. Participants
reporting more motivation score higher on average than other participants. There was not a
statistically significant difference in CAAP scores by GPA.
For the additional questions scores there was a statistically significant difference in scores by
major. Natural Science majors score higher on average than other majors. There was not a
statistically significant difference in additional questions scores by sex or GPA. The
interpretation of the additional questions scores needs to consider that these questions were
administered at the end of participants’ testing. The patterns in additional questions scores are
likely influenced by participants potentially rushing through this last part of the test.
The analysis found a correlation between the CAAP scores and the additional questions scores
of 0.34 (p value < 0.0001). This correlation suggests that there is a significant relationship
between the additional questions scores and CAAP scores. This relationship may suggest that
the CAAP and the additional questions are measuring similar domains of knowledge or that
they are in some other way related. The results suggest that the nine additional questions
might be a better predictor of CAAP scores than the first six questions alone (R2 of 0.1198 for
the nine questions and 0.0883 for the six questions).
Introduction
The Collegiate Assessment of Academic Proficiency (CAAP) “is the standardized, nationally
normed assessment program from ACT that enables postsecondary institutions to assess,
evaluate, and enhance student learning outcomes and general education program outcomes.” 1
The CAAP offers six different modules: Reading, Writing Skills, Writing Essay, Mathematics,
Science, and Critical Thinking. 2 As per decision of the University Assessment Committee in April
2010, the Office of Planning and Assessment administered the Science Test of the CAAP during
fall 2010. The Science Test “is a 45-item, 40-minute test designed to measure students' skills in
scientific reasoning. The contents of the Science Test are drawn from biological sciences (e.g.,
biology, botany, and zoology), chemistry, physics, and the physical sciences (e.g., geology,
astronomy, and meteorology). The test emphasizes scientific reasoning skills rather than recall
of scientific content or a high level of skill in mathematics or reading.” 3 Please find more
information on how the test is constructed and scored in Appendix A.
ACT allows the administering institution to add nine additional questions to the chosen module.
The Core Curriculum Committee (CCC) of Texas Tech University (TTU) has been working on its
own instrument to measure student abilities in various core areas, the Online Senior
Assessment (OSA). The OSA was designed in 2008 and has been administered to graduating
seniors from Texas Tech during the spring semester of 2008, 2009, and 2010. During the
summer of 2010 the same assessment was administered to incoming transfer students at Texas
Tech as the Transfer Student Assessment (TSA). The instrument has one section for each of the
following areas: Humanities, Multicultural, Mathematics, Natural Sciences, Technology and
Applied Science, and Social and Behavioral Sciences. The Natural Sciences section of the OSA is
designed to measure abilities that are also measured by the CAAP Science Test. Since the
Science Test is an established instrument it was decided to use the original six Natural Sciences
questions from the OSA along with three newly developed Natural Sciences questions to start
establishing validity for the OSA (see Appendix B for a copy of the additional questions).
In coordination with the various Colleges, the Office of Planning and Assessment administered
the Science Test in select courses across campus. Courses were chosen based on their
enrollment by student classification and size. As juniors and seniors were the target group,
courses with high junior and senior enrollment were selected to participate. In order to
represent the Colleges proportionately, larger courses were chosen from larger Colleges and smaller
courses were chosen from smaller Colleges. Included in the sample were courses from the
following Colleges: College of Agricultural Sciences and Natural Resources, College of
Architecture, College of Arts and Sciences, College of Education, College of Human Sciences,
College of Mass Communications, College of Visual and Performing Arts, Edward E. Whitacre Jr.
College of Engineering, Honors College, and Jerry S. Rawls College of Business Administration.
1 http://www.act.org/caap/, accessed 11/29/10
2 http://www.act.org/caap/test_modules.html, accessed 11/29/10
3 http://www.act.org/caap/test_science.html, accessed 11/29/10
A total of 495 Texas Tech University (TTU) students participated in the fall 2010 administration
of the Science Test. The Office of Planning and Assessment identified 17 answer sheets with
irregularities (e.g., pattern scoring) that were excluded from scoring. In order to have a proportionate
representation from each College, another 78 participants’ answer sheets were not selected for
scoring. The remaining 400 answer sheets were scored by ACT.
Sample
ACT provided scores for the 400 answer sheets of students who participated in the fall 2010
administration of the CAAP. Since the students’ R numbers were used as the student ID on the
Science Test, the Office of Planning and Assessment was able to obtain demographic
information for the participants from Institutional Research and Information Management
(IRIM). The demographic information obtained includes sex, ethnicity, student classification,
and college. Since the target population for the assessment consisted of TTU juniors and
seniors, the 69 participants of other student classifications (3 freshmen, 59 sophomores, 3
undergraduate second degree students, 2 graduate masters, 1 graduate doctoral, and 1
unreported) were excluded from this analysis for a final sample of 331 participants (161 juniors
and 170 seniors). The following graphs compare the final junior and senior samples to the
population of all TTU juniors or seniors by sex, ethnicity, and college.
[Charts: Sample and Population by Sex for juniors and for seniors, comparing the percentage of females and males in the sample with the percentage in the population.]
The samples of both the juniors and seniors appear to be good representations of their
respective populations in terms of sex.
[Charts: Sample and Population by Ethnicity for juniors and for seniors, comparing sample and population percentages across the ethnicity categories AI, AS, B, HI, M, NR, U, and WH.]
The samples of both juniors and seniors appear to be good representations of their respective
populations in terms of ethnicity.
[Charts: Sample and Population by College for juniors and for seniors, comparing sample and population percentages across the colleges AG, AR, AS, BA, ED, EN, HR, HS, MC, UC, UN, and VP.]
It appears that for the juniors the Rawls College of Business, the Whitacre College of
Engineering, and the College of Visual and Performing Arts were somewhat overrepresented,
and that the College of Arts and Sciences, the College of Education, and the College of Human
Sciences were somewhat underrepresented. It appears that for the seniors the College of Arts
and Sciences and the College of Visual and Performing Arts were overrepresented, and that the
College of Education was underrepresented. The lack of participants from the College of
Education is likely due to the number of students from other colleges that take courses in the
College of Education for teaching certification. Students who are seeking teaching certifications
are categorized according to their “home” college. For example, an undergraduate student
seeking certification in Secondary Education might be a History major, and thus be counted
under the College of Arts and Sciences.
Results
CAAP Scores Overall
The following table and graph give a summary of the juniors’ and seniors’ CAAP scores.
Summary of CAAP Scores
          n     Mean   SD    Min   Median   Max
Juniors   161   61.4   4.1   52    62       70
Seniors   170   61.3   4.1   51    61.5     71
[Histogram of CAAP Scores for juniors and seniors, with scores binned from 45 to 80.]
Note that the TTU juniors (mean = 61.4, SD = 4.1) and seniors (mean = 61.3, SD = 4.1)
performed similarly on the CAAP. The ACT website has a summary of the scores from 10,331
juniors and 8,237 seniors from 98 institutions that participated in the Science Test in fall 2010.4
On average the TTU juniors scored similarly to the national sample of juniors (mean = 61.4, SD =
4.7).5 On average the TTU seniors scored just above the national sample of seniors (mean =
60.7, SD = 4.8).6
4 http://act.org/caap/norms/, accessed 11/29/10
5 http://act.org/caap/norms/pdf/10Table12.pdf, accessed 11/29/10
6 http://act.org/caap/norms/pdf/10Table13.pdf, accessed 11/29/10
The following table shows the frequency of TTU juniors and seniors that received each score.
The table also shows national percentiles for each of these scores for the juniors and seniors
that were obtained from the ACT website.7 For each score, the national percentile shows the
percentage of junior or senior participants from the national sample that scored at or below
that score. For example, a score of 62 has a national percentile for juniors of 58 and a national
percentile for seniors of 54. This means that 58% of the national sample of juniors scored at or
below this score and that 54% of the national sample of seniors scored at or below this score.
National Percentiles for CAAP Scores
Score   TTU Juniors (n)   Juniors Natl. %   TTU Seniors (n)   Seniors Natl. %
71      0                 99                2                 99
70      4                 98                2                 99
69      1                 96                2                 96
68      7                 93                4                 93
67      5                 89                5                 89
66      15                85                17                85
65      7                 79                7                 78
64      10                72                11                71
63      17                66                15                64
62      23                58                20                54
61      1                 48                12                45
60      15                41                16                39
59      14                34                13                32
58      15                28                15                26
57      3                 22                8                 21
56      12                17                8                 16
55      5                 13                3                 12
54      5                 8                 6                 8
53      1                 5                 1                 5
52      1                 3                 1                 3
51      0                 1                 2                 1
Over half of the TTU juniors (89 participants, 55.2%) and over half of the TTU seniors (97
participants, 57.1%) scored above the 50th percentile nationally.
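As an illustration of how such percentile figures are read, the short Python sketch below computes the percentage of a score list falling at or below a given score. The scores shown are hypothetical and are not taken from the national sample.

```python
def percent_at_or_below(scores, target):
    """Percentage of scores that fall at or below the target score."""
    return 100 * sum(1 for s in scores if s <= target) / len(scores)

# Hypothetical example with five scores (not actual CAAP data).
sample = [58, 60, 62, 64, 66]
print(percent_at_or_below(sample, 62))  # 60.0: three of the five scores are at or below 62
```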
7 http://act.org/caap/norms/pdf/10Table12.pdf and http://act.org/caap/norms/pdf/10Table13.pdf, accessed 11/29/10
CAAP Scores by Sex, Major, GPA, and Motivation
The following table shows the summary statistics for CAAP scores by sex and major (see
Appendix F for additional information).
CAAP Scores by Sex and Major
Group                    Mean   SD    Min    Median   Max
All Participants
  All                    61.4   4.1   51.0   62.0     71.0
  Females                60.6   3.8   52.0   60.0     70.0
  Males                  62.0   4.2   51.0   62.0     71.0
Natural Science Majors
  All                    63.8   3.9   54.0   64.0     70.0
  Females                64.4   3.3   59.0   64.0     70.0
  Males                  63.5   4.2   54.0   64.5     70.0
Other Majors
  All                    61.0   4.0   51.0   61.0     71.0
  Females                60.3   3.6   52.0   60.0     70.0
  Males                  61.7   4.2   51.0   62.0     71.0
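A grouped summary like the table above can be produced with a standard grouped aggregation. The sketch below is only an illustration: it assumes a hypothetical pandas DataFrame with columns named score, sex, and major, and is not the code used to prepare this report.

```python
import pandas as pd

# Hypothetical data; the actual 331-participant data set is not reproduced here.
df = pd.DataFrame({
    "score": [62, 58, 65, 61, 70, 54, 64, 59],
    "sex":   ["F", "M", "F", "M", "M", "F", "F", "M"],
    "major": ["Natural Science", "Other", "Other", "Natural Science",
              "Other", "Other", "Natural Science", "Natural Science"],
})

# Mean, standard deviation, minimum, median, and maximum score by major and sex.
summary = df.groupby(["major", "sex"])["score"].agg(
    ["mean", "std", "min", "median", "max"]
)
print(summary)
```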
The following graphs show the average CAAP scores for TTU juniors and seniors by sex, major,
GPA, and participants’ self report of how motivated they were in taking the assessment.
[Chart: Average CAAP Scores by Sex. Juniors: female 60.6, male 62.1. Seniors: female 60.6, male 61.9.]
For both the juniors and seniors the male participants scored higher on average than the
female participants. While not large, these differences by sex are statistically significant at the
0.05 level (see Appendix C for details). This suggests that on average male TTU juniors and
seniors score better than female TTU juniors and seniors on the Science section of the CAAP.
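The comparison underlying this statement is a two-sample t test (reported in Appendix C). A minimal sketch of such a test, using hypothetical score lists in place of the actual participant data, might look like this:

```python
from scipy import stats

# Hypothetical CAAP scores by sex; the report's analysis used the full junior and senior samples.
female_scores = [58, 60, 61, 62, 59, 63]
male_scores = [60, 62, 64, 63, 61, 65]

t_stat, p_value = stats.ttest_ind(female_scores, male_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A p value below 0.05 is read here as a statistically significant difference in means.
```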
[Chart: Average CAAP Scores by Major. Juniors: Natural Science majors 64.3, other majors 61.1. Seniors: Natural Science majors 63.4, other majors 61.0.]
For both the juniors and the seniors the Natural Science majors scored higher on average than
the other majors. These differences by major are statistically significant at the 0.05 level (see
Appendix C for details). This suggests that on average TTU juniors and seniors with a Natural
Science major score better than TTU juniors and seniors with another major on the Science
section of the CAAP.
[Chart: Average CAAP Scores by GPA. Juniors: 62.0 (below 2.00), 62.6 (2.01 - 2.50), 62.0 (2.51 - 3.00), 60.3 (3.01 - 3.50), 62.1 (3.51 and above). Seniors: 54.0, 61.6, 60.4, 62.0, 61.5 for the same GPA groups.]
It appears that on average the seniors from the lowest GPA group scored lower than the other
participants, while there does not appear to be much difference in average scores between the
other GPA groups. The differences in scores by GPA are not statistically significant at the 0.05
level (see Appendix C for details). This suggests that participants' GPA does not have a
significant impact on their scores for the Science section of the CAAP.
[Chart: Average CAAP Scores by Motivation. Juniors: 62.6 (tried my best), 61.7 (gave moderate effort), 58.9 (gave little effort), 57.0 (gave no effort). Seniors: 62.7, 61.1, 59.7, 56.0 for the same groups.]
For both the juniors and seniors it appears that on average the participants that reported more
motivation and effort on the assessment scored higher. These differences between the groups
are statistically significant at the 0.05 level (see Appendix C for details). This suggests that on
average TTU juniors and seniors that are more motivated to put forth effort on the assessment
score higher.
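Appendix C reports these comparisons as F tests across the four motivation groups. A minimal one-way ANOVA sketch along those lines, using hypothetical group scores rather than the actual data:

```python
from scipy import stats

# Hypothetical CAAP scores for the four self-reported motivation groups (not the actual data).
tried_best = [63, 64, 62, 65, 61]
moderate_effort = [61, 62, 60, 63, 59]
little_effort = [58, 59, 60]
no_effort = [56, 57]

f_stat, p_value = stats.f_oneway(tried_best, moderate_effort, little_effort, no_effort)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A p value below 0.05 indicates that at least one group mean differs from the others.
```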
Additional Questions Scores Overall
The following table and graph give a summary of the sample’s scores on the additional
questions.
Summary of Additional Questions Scores
          n     Mean    SD      Min    Median   Max
Juniors   105   50.7%   18.4%   0.0%   55.6%    88.9%
Seniors   125   53.0%   18.8%   0.0%   55.6%    88.9%
[Histogram of Additional Questions Scores for juniors and seniors, with scores binned from 0.1 to 1.0.]
It appears that on average the seniors scored slightly higher on these additional questions than
the juniors. This difference was not statistically significant at the 0.05 level (see Appendix D
for details). Both the juniors and seniors answered, on average, between four and five of the
nine additional questions correctly, with a low score of zero questions answered correctly and a
high score of 8 questions answered correctly. When considering the scores for the additional
questions it is important to note that these questions were answered at the end of the testing
period (i.e., after the participants had already taken the 40-minute-long Science Test) and that
participants may have been rushed in trying to complete them. As shown in the table, only 230
of the 331 participants actually answered the additional questions.
Additional Questions Scores by Sex, Major, GPA, and Motivation
The following table shows summary statistics for additional questions (AQ) scores by sex and
major (see Appendix F for additional information).
AQ Scores by Sex and Major
Group                    Mean    SD      Min     Median   Max
All Participants
  All                    51.9%   18.6%   0.0%    55.6%    88.9%
  Females                51.6%   17.2%   0.0%    55.6%    88.9%
  Males                  52.2%   19.7%   0.0%    55.6%    88.9%
Natural Science Majors
  All                    61.0%   16.3%   22.2%   66.7%    88.9%
  Females                65.7%   10.5%   44.4%   66.7%    88.9%
  Males                  59.0%   18.0%   22.2%   66.7%    88.9%
Other Majors
  All                    50.2%   18.6%   0.0%    55.6%    88.9%
  Females                49.9%   19.9%   0.0%    55.6%    77.8%
  Males                  50.4%   17.1%   0.0%    50.0%    88.9%
The following graphs show the average additional questions scores by sex, major, and GPA.
[Chart: Average AQ Scores by Sex. Juniors: female 53.1%, male 48.9%. Seniors: female 50.5%, male 55.1%.]
The female juniors scored higher on average than the male juniors, and the female seniors
scored lower on average than the male seniors. These differences are not statistically
significant at the 0.05 level (see Appendix D for details). The lack of significant results for these
differences may be due to a smaller number of participants finishing the additional questions
and the greater variance in scores for the additional questions.
[Chart: Average AQ Scores by Major. Juniors: Natural Science majors 61.4%, other majors 48.6%. Seniors: Natural Science majors 60.6%, other majors 51.5%.]
For both the juniors and the seniors the Natural Science majors scored higher on average than
the other majors. These differences by major are statistically significant at the 0.05 level (see
Appendix D for details). This suggests that on average TTU juniors and seniors with a Natural
Science major score better than TTU juniors and seniors with another major on these additional
questions.
[Chart: Average AQ Scores by GPA. Juniors: 58.6% (2.01 - 2.50), 48.6% (2.51 - 3.00), 47.1% (3.01 - 3.50), 53.0% (3.51 and above); no juniors fell below 2.00. Seniors: 11.1% (below 2.00), 49.5% (2.01 - 2.50), 51.6% (2.51 - 3.00), 55.3% (3.01 - 3.50), 55.9% (3.51 and above).]
It appears that on average the seniors from the lowest GPA group scored lower than the other
participants, while there does not appear to be much difference in average scores between the
other GPA groups. These differences are not statistically significant at the 0.05 level (see
Appendix D for details). This suggests that participants' GPA does not have a significant
impact on their scores for these additional natural science questions.
Relationship between CAAP Scores and Additional Questions Scores
The following table shows how participants’ CAAP scores correlate with their overall additional
questions (AQ) scores, with their partial additional questions scores (scores for only the first six
additional questions that have been included in past administrations of the OSA), and with each
separate additional question.
Correlations with CAAP Scores
              Correlation   p value    n
AQ Score      0.34          < 0.0001   230
AQ Partial    0.30          < 0.0001   230
AQ1           0.06          0.3195     291
AQ2           0.12          0.0337     289
AQ3           0.19          0.0010     283
AQ4           0.19          0.0013     276
AQ5           0.19          0.0015     272
AQ6           0.17          0.0062     269
AQ7           0.08          0.2103     248
AQ8           0.21          0.0012     242
AQ9           0.12          0.0678     237
The correlation between participants’ CAAP scores and participants’ additional questions scores
(0.34) is statistically significant at the 0.05 level. This means that there is a relationship
between participants’ CAAP scores and participants’ additional questions scores. This
relationship may suggest that the CAAP and the additional questions are measuring similar
domains of knowledge or it might suggest that participants’ performance is in some other way
related (e.g., participants that do well on tests do well on both the CAAP and the additional
questions).
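The statistic reported above is a Pearson correlation between the two sets of scores. A minimal sketch of such a calculation, using hypothetical paired scores rather than the report's data:

```python
from scipy import stats

# Hypothetical paired scores for participants who completed both parts (not the report's data).
caap_scores = [58, 60, 62, 64, 66, 61, 59, 63]
aq_scores = [0.33, 0.44, 0.56, 0.67, 0.78, 0.56, 0.44, 0.67]

r, p_value = stats.pearsonr(caap_scores, aq_scores)
print(f"r = {r:.2f}, p = {p_value:.4f}")
```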
Looking at each additional question, it appears that the number of correct responses for most of
the additional questions was significantly correlated with CAAP scores at the 0.05 level. The
correlations between CAAP scores and correct responses for additional questions 1, 7, and 9
were not statistically significant at the 0.05 level. This might suggest that these questions are
less related to the knowledge assessed by the CAAP science section.
The following graph shows how participants’ additional questions scores relate to their CAAP
scores.
[Scatter plot: CAAP Scores by AQ Scores, with CAAP scores on the vertical axis (40-75) and additional questions scores on the horizontal axis (0.000-1.000).]
The following table shows a summary of the linear regression model using participants’
additional questions scores as a predictor for participants’ CAAP scores.
Regression Model of CAAP Scores by Additional Questions Scores
n = 230, F value = 31.03, p value < 0.0001, R2 = 0.1198

Variable    Coefficient   Std. Error   p value
Intercept   57.44         0.77         < 0.0001
AQ Score    7.74          1.39         < 0.0001
The table confirms that participants’ additional questions scores are significantly related to
participants’ CAAP scores. The R2 value of 0.1198 reported in the table means that
approximately 12% of the variance in CAAP scores can be explained by the variance in
additional questions scores. This again suggests that participants’ additional questions scores
and CAAP scores are related.
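A simple linear regression like the model summarized above can be sketched as follows. The paired scores are hypothetical, and the code only illustrates how a slope, intercept, and R2 value of this kind are obtained; it is not the analysis code used for this report.

```python
import numpy as np
from scipy import stats

# Hypothetical paired scores; not the 230-participant data set analyzed in the report.
aq_scores = np.array([0.33, 0.44, 0.56, 0.67, 0.78, 0.56, 0.44, 0.67])
caap_scores = np.array([58, 60, 62, 64, 66, 61, 59, 63])

result = stats.linregress(aq_scores, caap_scores)
print(f"intercept = {result.intercept:.2f}, slope = {result.slope:.2f}")
print(f"R^2 = {result.rvalue ** 2:.4f}")  # share of CAAP score variance explained by AQ scores
```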
The following table shows a summary of the linear regression model using participants’ scores
from only the first six additional questions as a predictor for participants’ CAAP scores. The first
six additional questions are the only questions that have been administered in past
administrations of the OSA.
Regression Model of CAAP Scores by First Six Additional Questions Scores
n = 230, F value = 22.08, p value < 0.0001, R2 = 0.0883

Variable    Coefficient   Std. Error   p value
Intercept   58.13         0.76         < 0.0001
AQ Score    5.22          1.11         < 0.0001
The table shows that participants’ scores for the first six additional questions are also
significantly related to participants’ CAAP scores. The R2 value of 0.0883 reported in the table
means that approximately 9% of the variance in CAAP scores can be explained by the variance
in scores for the first six additional questions. This suggests that including all nine additional
questions provides a better predictor of CAAP scores.
Conclusion
Overall it appears that TTU juniors score similarly on average to other juniors nationally and
that TTU seniors score slightly higher on average than other seniors nationally on the Science
Test from the CAAP. Considering the CAAP scores by sex suggests that on average male juniors
and seniors score higher than female juniors and seniors. Considering the CAAP scores by
major suggests that on average juniors and seniors with a Natural Science major score higher
than juniors and seniors with another major. Considering CAAP scores by motivation suggests
that students reporting that they were more motivated in putting forth effort on the
assessment score higher. GPA does not appear to have a significant influence on CAAP scores
for TTU juniors and seniors.
Considering scores for the additional questions by sex, the female juniors scored higher on
average than the male juniors and the female seniors scored lower on average than the male
seniors, but these differences were not statistically significant at the 0.05 level. This lack of
significance might be due to the large variance in the additional questions scores and the
smaller number of participants who completed them. Considering
scores for the additional questions by major suggests that on average juniors and seniors with a
Natural Science major score higher than juniors and seniors with another major. GPA does not
seem to have a significant influence on additional questions scores. The interpretation of
additional questions scores needs to consider that these questions were administered at the
end of participants’ testing. The patterns in additional questions scores are likely influenced by
participants potentially rushing through this last part of the test.
The correlation between additional questions scores and CAAP scores suggests that the
additional questions scores are related to CAAP scores. This is also true for the partial score
including only the first six additional questions that have been administered in past
administrations of the OSA. This relationship may suggest that the CAAP and the additional
questions are measuring similar domains of knowledge or that they are in some other way
related (e.g., students that do well on tests do well on both the CAAP and the additional
questions). Comparing the R2 values for the regression equation using scores for all nine
questions and the regression equation using scores for only the first six questions suggests that
the nine additional questions might be a better predictor of CAAP scores than the first six
questions alone.
Appendix
Appendix A: Information about the CAAP Science Test
[Screenshots from http://www.act.org/caap/test_science.html, accessed 11/29/10.]
Appendix B: Copy of the Additional Questions
Student ID: R______________________________
INSTRUCTIONS: Please turn to page 3 in your answer document and find block P, Additional
Questions. Each question below corresponds to a column in block P. Read the question and
then mark only one circle in each column that best describes your response. For example, if for
question A you think answer option 7 is correct, you would bubble in the 7 in column A, etc.
QUESTIONS
Column A
Which of the following is NOT a property that defines life?
0. Metabolism
1. Heredity
2. Homeostasis
3. Movement
4. Cellular organization
Column B
The increased use of antibiotics in people and livestock is leading to:
0. increased mutation rates in bacteria
1. decreased mutation rates in bacteria
2. selection against antibiotic-resistant bacteria
3. selection for antibiotic-resistant bacteria
4. does not have an effect on the evolution of bacteria
Column C
The oldest oceanic crust on Earth formed approximately 160 million years ago. In view of the
fact that Earth is 4.6 billion years old, why is oceanic crust so young?
0. Oceanic crust is continuously being destroyed by subduction.
1. Oceanic crust first began to form 160 million years ago; before that no oceans existed.
2. Older oceanic crust exists but has not been dated.
3. There is no difference between oceanic and continental crust.
For the following paragraphs select the terms that best complete the blanks in the sentences
(Columns D - F).
Sally the Scientist visited her father, who told her of a new product he is using to cure his
baldness. Called “Bald Away,” it is an oil derived from a rare tropical tree and is very expensive.
His doctor showed him a short film about the product where three men talked about the
effectiveness of the oil and showed before and after pictures of their heads. Sally was worried
that the so-called cure was really just an example of “Pseudoscience” because the only reason
given to accept the conclusions was anecdotal evidence.
Sally decided to study the effectiveness of the oil to help her dad decide whether to purchase
the product or not. She wrote down, “When following the direction on the bottle, use of the oil
leads to improved hair growth on bald men after one week.” This statement serves as her
Column D.___________ for the study. To conduct the experiment, she recruited fifty bald men
and the 25 in Group A were given “Bald Away” and Group B was given another oil that she
knew did not affect hair growth.
0. non-control group
1. control group
2. conclusion
3. hypothesis
The fifty men served as her sample. Group B served as the Column E.___________ in the
experiment.
0. non-control group
1. control group
2. conclusion
3. hypothesis
In order to make her study more scientifically valid, she randomly determined which group
each man was put into and made sure that none of them knew if they were using Bald Away or
the other oil. In addition, she had an assistant keep track of which group each man was in so
that she did not know either. Because the men did not know what product they were using and
neither did she, it is called a double blind study and she reduced the bias in the experiment.
After a week, Sally met with each of the men and determined that 18 of them had increased
hair growth. When her assistant told her which group each of the men was in, she found that
21 out of 25 using Bald Away had more hair and 2 out of 25 from the other group showed an
increase. She concluded that her hypothesis was Column F.___________ and prepared a
journal article explaining the experiment and its implications.
0. absolutely proved
1. supported
2. not supported
3. absolutely disproved
She sent it to the Journal of Bald Studies and the editor sent it to three baldness experts to
evaluate the study, a process called peer review. They concluded that the study was properly
done and the results can be published.
Column G.
You have collected data from a replicated experiment that has examined the impacts of
increasing levels of atmospheric CO2 on the yearly growth of three tree species: red oak,
loblolly pine, and western red cedar. Your results indicate that the different plant species
respond differently to increasing amounts of CO2. In addition, you discover that red oaks
under the highest levels of CO2 have the largest amounts of new shoot growth. How would
you determine if the differences in mean shoot growth among the three tree species in
response to varying levels of CO2 are accurate and not a random response?
0. Conduct a second set of experiments to see if similar results are obtained as in the first.
1. Conduct a statistical analysis of the data to determine if the mean shoot growth values
for each plant/CO2 level combination are significantly different from each other at an
established probability.
2. Rank the values obtained from low to high and determine how close the values are to
each other. This measure of similarity will establish if the outcome is random or not.
3. All of the above are acceptable approaches for determining treatment versus random
effects.
Column H.
When you turn on your kitchen faucet some of the water that comes out of the tap has
already been through several sets of kidneys before you have taken that first sip. Which of
the following statements concerning watersheds, water use planning, and the water cycle
should impact our water use and water management decisions?
0. In many cities and towns, grey water passing through wastewater treatment plants is
directly routed back into rivers and streams after processing. This water is subsequently
used by cities and towns downriver from the point of origin.
1. All of our water, whether it comes from reservoirs, a well, or even a bottle is affected by
how people use the land through which water flows.
2. The water we all drink has been recycled through natural processes, human engineered
systems, or both.
3. The goal of watershed management is to balance multiple, competing demands, using
scientific studies in combination with human judgments to make decisions on how we
use our water resources.
4. All of the above could contribute to developing a comprehensive water management
plan.
Column I.
Tomatoes are highly valued around the world; however fruit that is picked before fully
ripening or allowed to over ripen can compromise the flavor and texture of the harvest
before it reaches the consumer.
Which of the following approaches could be examined in a research context to produce a
more appetizing tomato?
0. Identify and block late-stage tomato-ripening processes so that producers can
selectively activate ripening at the time of sales.
1. Locate the producer closer to the consumer so that farm products can be more rapidly
transported at the time of harvest.
2. Alert consumers when tomatoes are in season so that people can plan when to eat fresh
tomatoes.
3. 0 and 2 are both correct responses
4. 0, 1 and 2 are all correct responses.
Appendix C: Analysis of CAAP Scores by Sex, Major, GPA, and Motivation
Analysis of CAAP Scores by Sex - Juniors
         n    Mean   St. Dev.
Female   68   60.6   3.84
Male     93   62.1   4.17
t value = 2.29, p value = 0.0231

Analysis of CAAP Scores by Sex - Seniors
         n    Mean   St. Dev.
Female   77   60.6   3.73
Male     93   61.9   4.33
t value = 1.99, p value = 0.0481

Analysis of CAAP Scores by Major - Juniors
                        n     Mean   St. Dev.
Natural Science Major   17    64.3   4.3
Other Major             144   61.1   3.9
t value = 3.13, p value = 0.0021

Analysis of CAAP Scores by Major - Seniors
                        n     Mean   St. Dev.
Natural Science Major   23    63.4   3.6
Other Major             147   61.0   4.1
t value = 2.67, p value = 0.0084
Analysis of CAAP Scores by GPA - Juniors
                 n    Mean   St. Dev.
below 2.00       1    62.0   .
2.01 - 2.50      22   62.6   4.58
2.51 - 3.00      41   62.0   3.80
3.01 - 3.50      56   60.3   3.91
3.51 and above   38   62.1   4.08
F value = 2.13, p value = 0.0795

Analysis of CAAP Scores by GPA - Seniors
                 n    Mean   St. Dev.
below 2.00       1    54.0   .
2.01 - 2.50      12   61.6   3.63
2.51 - 3.00      49   60.4   4.71
3.01 - 3.50      58   62.0   3.69
3.51 and above   48   61.5   3.88
F value = 1.86, p value = 0.1206

Analysis of CAAP Scores by Motivation - Juniors
                       n    Mean   St. Dev.
tried my best          37   62.6   4.32
gave moderate effort   99   61.7   3.85
gave little effort     16   58.9   2.92
gave no effort         2    57.0   1.41
F value = 3.88, p value = 0.0049

Analysis of CAAP Scores by Motivation - Seniors
                       n    Mean   St. Dev.
tried my best          56   62.7   4.08
gave moderate effort   78   61.1   3.70
gave little effort     23   59.7   3.51
gave no effort         4    56.0   4.83
F value = 4.61, p value = 0.0015
Appendix D: Analysis of Additional Questions Scores by Sex, Major, GPA, and
Motivation
Analysis of Additional Questions Scores by Sex - Juniors
         n    Mean    St. Dev.
Female   45   53.1%   15.0%
Male     60   48.9%   20.6%
t value = 1.16, p value = 0.2498

Analysis of Additional Questions Scores by Sex - Seniors
         n    Mean    St. Dev.
Female   57   50.5%   18.9%
Male     68   55.1%   18.7%
t value = 1.36, p value = 0.1769

Analysis of Additional Questions Scores by Major - Juniors
                        n     Mean    St. Dev.
Natural Science Major   17    61.4%   18.5%
Other Major             88    48.6%   17.8%
t value = 2.71, p value = 0.0079

Analysis of Additional Questions Scores by Major - Seniors
                        n     Mean    St. Dev.
Natural Science Major   20    60.6%   14.6%
Other Major             105   51.5%   19.3%
t value = 1.99, p value = 0.0492
Analysis of Additional Questions Scores by GPA - Juniors
                 n    Mean    St. Dev.
below 2.00       0    .       .
2.01 - 2.50      18   58.6%   13.6%
2.51 - 3.00      27   48.6%   22.3%
3.01 - 3.50      38   47.1%   18.3%
3.51 and above   22   53.0%   15.3%
F value = 1.89, p value = 0.1357

Analysis of Additional Questions Scores by GPA - Seniors
                 n    Mean    St. Dev.
below 2.00       1    11.1%   .
2.01 - 2.50      11   49.5%   17.5%
2.51 - 3.00      39   51.6%   18.6%
3.01 - 3.50      37   55.3%   18.6%
3.51 and above   35   55.9%   18.4%
F value = 1.78, p value = 0.1370

Analysis of Additional Questions Scores by Motivation - Juniors
                       n    Mean    St. Dev.
tried my best          24   57.4%   16.9%
gave moderate effort   68   48.9%   18.9%
gave little effort     10   46.7%   18.0%
gave no effort         1    44.4%   .
F value = 1.16, p value = 0.3348

Analysis of Additional Questions Scores by Motivation - Seniors
                       n    Mean    St. Dev.
tried my best          42   54.5%   18.5%
gave moderate effort   58   54.8%   18.9%
gave little effort     16   52.8%   11.8%
gave no effort         3    29.6%   23.1%
F value = 2.55, p value = 0.0426
Appendix E: CAAP Scores and Additional Questions Scores by Student
Classification
Analysis of CAAP Scores by Student Classification
         n     Mean   St. Dev.
Junior   161   61.4   4.09
Senior   170   61.3   4.10
t value = 0.3, p value = 0.7643

Analysis of Additional Questions Scores by Student Classification
         n     Mean    St. Dev.
Junior   105   50.7%   18.4%
Senior   125   53.0%   18.8%
t value = 0.93, p value = 0.3545
Appendix F: CAAP Scores and Additional Questions Scores by Sex, Major, and
Minor
Natural Sciences Majors
        Male   Female   Total
BIOL    2      2        4
CHEM    2      0        2
GEOS    24     9        33
PHYS    0      1        1
Total   28     12       40

Natural Sciences Minors
        Male   Female   Total
ATMO    4      5        9
BIOL    0      2        2
CHEM    5      3        8
GEOL    1      0        1
GEOP    3      0        3
PHYS    1      0        1
Total   14     10       24
Median CAAP Scores by Sex and Major
                        Male   Female   Total
Natural Science Major   64.5   64       64
Other Major             62     60       61
All                     62     60       62

Median CAAP Scores by Sex and Minor
                        Male   Female   Total
Natural Science Minor   63.5   63.5     63.5
Other Minor             62     60       62
All                     62     60       62

Median CAAP Scores by Sex and Major or Minor
                                 Male   Female   Total
Natural Science Major or Minor   64     64       64
Other Major and Minor            62     60       61
All                              62     60       62

Mean CAAP Scores by Sex and Major
                        Male   Female   Total
Natural Science Major   63.5   64.4     63.8
Other Major             61.7   60.3     61.0
All                     62.0   60.6     61.4

Mean CAAP Scores by Sex and Minor
                        Male   Female   Total
Natural Science Minor   63.1   63.7     63.3
Other Minor             61.9   60.4     61.2
All                     62.0   60.6     61.4

Mean CAAP Scores by Sex and Major/Minor
                                 Male   Female   Total
Natural Science Major or Minor   63.5   63.6     63.5
Other Major and Minor            61.6   60.2     61.0
All                              62.0   60.6     61.4
Median Additional Questions Scores by Sex and Major
                        Male    Female   Total
Natural Science Major   66.7%   66.7%    66.7%
Other Major             50.0%   55.6%    55.6%
All                     55.6%   55.6%    55.6%

Median Additional Questions Scores by Sex and Minor
                        Male    Female   Total
Natural Science Minor   55.6%   66.7%    66.7%
Other Minor             55.6%   55.6%    55.6%
All                     55.6%   55.6%    55.6%

Median Additional Questions Scores by Sex and Major or Minor
                                 Male    Female   Total
Natural Science Major or Minor   66.7%   66.7%    66.7%
Other Major and Minor            44.4%   55.6%    55.6%
All                              55.6%   55.6%    55.6%

Mean Additional Questions Scores by Sex and Major
                        Male    Female   Total
Natural Science Major   59.0%   65.7%    61.0%
Other Major             50.4%   49.9%    50.2%
All                     52.2%   51.6%    51.9%

Mean Additional Questions Scores by Sex and Minor
                        Male    Female   Total
Natural Science Minor   54.4%   64.8%    58.3%
Other Minor             52.0%   50.8%    51.5%
All                     52.2%   51.6%    51.9%

Mean Additional Questions Scores by Sex and Major/Minor
                                 Male    Female   Total
Natural Science Major or Minor   60.3%   64.8%    61.7%
Other Major and Minor            49.9%   49.9%    49.9%
All                              52.2%   51.6%    51.9%