
A Study of ALEKS for 8th Grade Pre-Algebra Students
ALEKS Corporation and Ken Petlock
April 10, 2009
Overview
The following is a summary of a paper in progress. The paper details a year-long study
of the ALEKS software among three classes of 8th grade students at a middle school in
California. This summary begins (part I) with a short description of ALEKS and the
theory behind its artificial intelligence, called Knowledge Space Theory. Then, some
details are given about the instructor, students, and the specific implementation of
ALEKS (part II). The next section (part III) contains an analysis of the data collected in
the study, which involves two main lines. The first is an examination of the
effectiveness of ALEKS as a learning tool (part IIIA). The second is an examination of
ALEKS as an assessment / diagnostic tool (part IIIB).
I. Description of ALEKS and Knowledge Space Theory
ALEKS (Assessment and LEarning in Knowledge Spaces) is a Web-based, artificially
intelligent assessment and learning system. Its artificial intelligence is based on a field of
study called Knowledge Space Theory (KST); for an extensive database of references,
see http://wundt.uni-graz.at/kst.php. KST is unique among theories guiding assessment and
learning. It allows the representation (in a computer’s memory) of an enormously large
number of possible knowledge states organizing a scholarly subject. Rather than giving a
score or series of scores that describe a student’s mastery of the subject, KST allows for a
precise description of what the student knows, does not know, and is ready to learn next.
According to KST, a subject such as arithmetic, Algebra I, or first-year college chemistry
can be parsed into a set of problem types, with each problem type covering a specific
concept (or skill, fact, problem-solving method, etc.). A student’s competence can then
be described by the set of problem types that she is capable of solving. This set is the
student’s knowledge state. The knowledge space is the collection of all of the knowledge
states that might feasibly be observed in a population. Each subject typically has 200-300
problem types and several million knowledge states.
A. The assessment
The task of a KST assessment is to uncover, by efficient questioning, the knowledge state
of a particular student. Because the number of problem types for a typical subject is large,
only a small subset of them can be used in any assessment. So, the successive questions
must be chosen cleverly.
At the beginning of a KST assessment, each knowledge state is given some initial
probability. A question (problem type) is chosen, and based on the student’s answer, the
probabilities are updated: if the student answers correctly, then each knowledge state
containing that problem type is increased in probability, and if the student answers
incorrectly, then each of those states is decreased in probability. The next question is
chosen so as to be as informative as possible (according to a particular measure), and the
process continues until there is one knowledge state with a much higher probability than
the others. This is the knowledge state assigned to the student.
The process usually takes from 25 to 35 questions. The assessment’s efficiency is
significant, considering that the scholarly subject has about 200-300 problem types and
its knowledge space has millions of feasible states. This efficiency stems from the many
inferences made by the system via the knowledge space.
We note that, because all problem types have either open-ended responses or multiple
choice responses with a large number of choices, the chance of a lucky guess is
negligible. This is another important contrast with many testing devices.
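To make the updating scheme concrete, the following sketch implements a simplified version of the assessment loop in Python. Everything here is a toy illustration: the six-state knowledge space, the item names, and the multiplicative update factor are invented for the example, and the actual ALEKS engine works over millions of states with its own calibrated updating and question-selection rules.

```python
# Toy sketch of the KST assessment loop described above.  The knowledge
# space, item names, and update factor are hypothetical.

states = [frozenset(s) for s in
          [(), ("a",), ("a", "b"), ("a", "c"), ("a", "b", "c"),
           ("a", "b", "c", "d")]]

# Every state starts with the same probability.
probs = {s: 1.0 / len(states) for s in states}

def update(probs, item, correct, factor=2.0):
    """Raise the probability of each state consistent with the answer,
    lower the others, and renormalize (a simplified multiplicative rule)."""
    new = {s: p * (factor if (item in s) == correct else 1.0 / factor)
           for s, p in probs.items()}
    total = sum(new.values())
    return {s: p / total for s, p in new.items()}

def most_informative(probs):
    """Choose the item whose current mastery probability is nearest 1/2,
    i.e. the question expected to reduce uncertainty the most."""
    items = set().union(*probs)
    mastery = lambda x: sum(p for s, p in probs.items() if x in s)
    return min(items, key=lambda x: abs(mastery(x) - 0.5))

# Simulated answers: "a" and "b" correct, "c" and "d" incorrect.
for item, correct in [("a", True), ("b", True), ("c", False), ("d", False)]:
    probs = update(probs, item, correct)

best = max(probs, key=probs.get)   # the state assigned to the student
next_q = most_informative(probs)   # what a further question would probe
print(sorted(best))  # ['a', 'b']
```

In the real system the loop terminates once one state's probability dominates all others; here the state {a, b} emerges as by far the most probable after four answers.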
B. The fringes
Each knowledge state is uniquely specified by two much smaller subsets, the inner fringe
and the outer fringe of the state. In everyday terms, the inner fringe is the set of problem
types that represent the high points of the student’s competence, and the outer fringe is
the set of problem types that the student is ready to learn next. These fringes are typically
rather small, and they completely specify a knowledge state that may contain dozens of
problem types. The economy is notable. Moreover, the summary is more informative to
a teacher than is a grade or a percentile and certainly more useful. For example, the
students’ outer fringes can be used by the teacher to gather students in small,
homogeneous groups ready to learn a given problem type, fostering efficient learning.
Note that the outer fringe of a state can be seen as a formal implementation of the zone of
proximal development in the sense of Vygotsky. [reference]
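The fringes have a simple set-theoretic characterization that can be sketched directly: the inner fringe of a state K contains the items x in K such that K minus {x} is also a feasible state, and the outer fringe contains the items x outside K such that K plus {x} is a feasible state. The toy knowledge space below is hypothetical, not actual ALEKS content.

```python
# Sketch of the fringe computations on a toy knowledge space (the states
# and item names are invented for illustration).

space = {frozenset(s) for s in
         [(), ("a",), ("a", "b"), ("a", "c"), ("a", "b", "c"),
          ("a", "b", "c", "d")]}

def inner_fringe(state, space):
    """Items whose removal still leaves a feasible state: the 'high
    points' of the student's competence."""
    return {x for x in state if state - {x} in space}

def outer_fringe(state, space):
    """Items outside the state whose addition yields a feasible state:
    what the student is ready to learn next."""
    items = set().union(*space)
    return {x for x in items - state if state | {x} in space}

k = frozenset({"a", "b"})
print(sorted(inner_fringe(k, space)))  # ['b']
print(sorted(outer_fringe(k, space)))  # ['c']
```

Note how small the fringes are relative to the state itself; in a realistic space with dozens of items per state, this economy is what makes the fringe summary practical for a teacher.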
C. The learning mode
The assessment gives precise access to further learning. After being assigned a
knowledge state by the assessment, the student may choose a problem type from her
outer fringe and begin learning. Once the system determines that she has mastered the
problem type, it is added to her knowledge state, and she may choose another problem
type to learn from her outer fringe. Subsequent assessments update her knowledge state.
II. Description of the instructor, students, and implementation of
ALEKS in the current study
The study took place during the 2007-2008 school year at a middle school in Northern California, which will be referred to as “the observed middle school”.
A. The instructor
The instructor, Mr. Petlock, was in his eighth year as a full-time credentialed teacher and
his third year teaching with the ALEKS software. He held a multiple subject teaching
credential with a supplemental authorization in Mathematics. He was hired about two
months into the school year (October 5) to replace a long-term substitute.
B. The students
In the district being studied, 70% of students participated in the free or reduced-price
lunch program, compared to the state average of 51% (Source: CA Dept. of Education,
2007-2008). The majority of students were White (57%).
Students at the observed middle school were ability grouped into math classes starting in
the 7th grade and distributed into six classes. One of the classes was comprised of the
brightest students from the previous year’s 6th grade classes. This class matriculated into
an Algebra I class in 8th grade. Students in the remaining five classes were placed in a
blocked (two period) Pre-Algebra course. At the end of the 7th grade year, another cut of
the brightest students was made to fill one class. These students entered an Algebra IA
course in their 8th grade year and then completed Algebra IB during their 9th grade year in
high school. The remaining 7th graders, about half of the entire 7th grade class, were then
recycled into another Pre-Algebra course the following year (8th grade).
This latter group consisted of the lower half of the (non-special education) 8th grade class,
as determined by grades and California Standards Test (CST) scores. These were the 59
students studied.
C. The implementation
Each class period was 43 minutes. Most instructional time occurred in the computer lab.
The lab was equipped with a digital projector and a large white board in the front of the
room.
Each class had a majority of students using the ALEKS Pre-Algebra product and a
minority using the ALEKS Essential Mathematics product. The instruction consisted of a
mixture of whole class and small group direct instruction, self-directed learning and peer
tutoring, all done chiefly using the ALEKS software. The direct instruction, for example,
was initiated by the ALEKS report giving the “outer fringe” of a student, that is, the
topics the student is deemed “ready to learn” by ALEKS. Using this report, the instructor
gave specified instruction on a particular topic to small groups or individuals, allowing
for highly differentiated learning. As a result of the diverse learning needs of the class
as a whole, the majority of direct instruction occurred in small groups or on an
individual level. This instruction was facilitated by a wireless InterWrite Pad
[http://www.einstruction.com/products/interactive_teaching/pad/index.html] and the
Vision Classroom Management Software
[http://www.netop.com/products/education/vision6.htm].
Soon after ALEKS was fully implemented, homework was rarely assigned.1
Students’ course grades were given using the following weights: 60% Assessments
(taken via ALEKS), 20% daily learning progress (number of ALEKS topics learned per
day), and 20% from other assignments.
In addition to ALEKS, Mr. Petlock used STAR Math, a component of Accelerated Math
(by Renaissance Learning), as an independent measure of math ability, knowledge, and
growth throughout the year. STAR Math is discussed further in section IIIB of this
write-up, which examines the effectiveness of ALEKS as an assessment tool.
1 According to Mr. Petlock, with their predominantly low socio-economic status, many students did not
have the support structure at home needed to consistently complete an assignment based on classroom
instruction. Furthermore, because ALEKS allows for a highly personalized approach, a one-size-fits-all
homework assignment was neither practical nor appropriate.
D. Note about scientific controls in the study
In reviewing this write-up, the reader will note that the students studied fell into two
mutually exclusive groups: (i) those who were taught by Mr. Petlock and used ALEKS,
and (ii) those who were not taught by Mr. Petlock and didn’t use ALEKS. In other
words, from this study, it is difficult to separate the effect of Mr. Petlock from the effect
of ALEKS.2 Rather, what can safely be concluded from the study is that systematic use
of the ALEKS software by a conscientious teacher can give the results detailed in the
write-up.
III. Data Analysis
There are two lines of data analysis for the study. The first looks at the usefulness of
ALEKS as a learning tool during the 2007-2008 school year. The second examines the
effectiveness of the ALEKS assessment as a diagnostic tool for predicting performance
on the California Standards Test (CST).3
The CST is taken each year by nearly all students in California public schools. There is
a different CST for each grade level, but each has 65 multiple choice questions and is
scored on a scale of 150 to 600. Though the scoring scale is the same for each grade, the
tests are criterion-referenced, in that each score is based in part on the assessed mastery
of the California state standards for that grade.4 As a result, scores from the same grade
and different years (e.g., 8th grade 2007 and 8th grade 2008 scores) can be compared
directly, but scores between grades (e.g., 7th grade and 8th grade scores) cannot. For more
information, see the California Department of Education’s website, especially
http://star.cde.ca.gov/star2008/help_comparescores.asp.
A student’s score gives rise to his performance level. There are five performance levels:
Far Below Basic, Below Basic, Basic, Proficient, and Advanced. The goal of the
California Department of Education is for all students to score at the Proficient or
Advanced levels, so the percentage of students scoring at these two levels is of special
interest.5 We include this percentage in our analysis below.
2 Similar comments may be made about the InterWrite Pad, Vision Classroom Management Software, and
STAR Math Assessment, though to a lesser extent because these items were more secondary in the
instruction than ALEKS was.
3 There are California Standards Tests in several subjects, including Mathematics, Science, English-
Language Arts, etc. For convenience in this write-up, though, we mean only the Mathematics test when we
refer to the CST.
4 Scores adjust for differences in the difficulty of a test from year to year, as some questions are changed.
5 For each test, a score of 300 separates the categories of Below Basic and Basic. A score of 350 separates
the categories of Basic and Proficient. For more information, see
http://www.cde.ca.gov/ta/tg/sr/resources.asp.
A. Learning Improvement Using ALEKS
We first compare the CST results of Mr. Petlock’s students with those of the Pre-Algebra
students at the observed middle school the year before, in 2007.6 Figure 1 shows the
mean CST scores (in red) for the 8th graders in Pre-Algebra at the middle school in 2007
and 2008. In 2008, these 8th graders were Mr. Petlock’s students; in 2007, these 8th
graders were a group taught by another teacher who did not use ALEKS. This latter
group is a reasonable group to which to compare results.
For reference, also shown in Figure 1 are the relevant CST score means for the District,
County, and State.
Looking at Figure 1, we see that Mr. Petlock’s students scored much better than the 8th
grade Pre-Algebra students from the year before at the observed middle school. In
particular, the previous year’s students scored well below the District, County, and State
means, and Mr. Petlock’s students scored well above these means, even with increases in
these means from 2007 to 2008.
Figure 1: Comparison of 2007 and 2008 8th Grade CST Mean Scores
[Bar chart of mean CST scores (scale 150-400). 2007 8th Grade: District 316, County 318,
State 316, observed school’s 8th grade Pre-Algebra students 304. 2008 8th Grade: District
308, County 322, State 320, observed school’s 8th grade Pre-Algebra students 335.]
The scores of the observed school’s 8th grade Pre-Algebra students, shown in red, increased
from well “below average” in 2007 to well “above average” in 2008, where the “average”
refers to the District, County, or State mean. The 2008 8th grade Pre-Algebra group shown
is Mr. Petlock’s class; the 2007 8th grade Pre-Algebra group shown is a reasonable
comparison group.
6 The other 8th grade math teacher at the observed middle school in 2007-2008 had more advanced students
following a different (Algebra I) curriculum from Mr. Petlock’s students, and a comparison with those
students would not be useful.
A (two-tailed) t-test was performed to compare the mean CST scores for the observed
middle school’s 8th grade Pre-Algebra students in 2007 and 2008. For this t-test, it was
not assumed that the variances of the two populations were equal7 [reference]. This test
is summarized in the table below.
2007 Observed Middle School Pre-Algebra (no ALEKS): sample size 41, mean CST score 304,
  standard deviation of CST scores 35
2008 Observed Middle School Pre-Algebra (ALEKS): sample size 58, mean CST score 337,
  standard deviation of CST scores 55
Results: t ≈ 3.644, df = 96, p ≈ 0.0004 (two-tailed)
So, there was a significant difference in the mean CST score between the 2007 8th grade
Pre-Algebra students at the observed middle school (who did not use ALEKS) and those
in 2008 (who did use ALEKS).
Note in Figure 1 that the mean for the entire State increased slightly from 2007
(mean=316) to 2008 (mean =320). So, it may be argued that the test in 2008 was “easier”
than in 2007, and that this should be considered when comparing the two sample groups.
We have chosen to do two separate t-tests, one comparing the 2007 observed middle
school’s Pre-Algebra group to the State mean in 2007, and the other comparing Mr.
Petlock’s 2008 Pre-Algebra group to the State mean in 2008. These t-tests are
summarized in the table below.
2007 Observed Middle School Pre-Algebra (no ALEKS): sample size 41, mean CST score 304,
  standard deviation 35, “population mean” being compared 316.
  Results: t ≈ -2.195, df = 40, p ≈ 0.017 (one-tailed)
2008 Observed Middle School Pre-Algebra (ALEKS): sample size 58, mean CST score 337,
  standard deviation 55, “population mean” being compared 320.
  Results: t ≈ 2.354, df = 57, p ≈ 0.011 (one-tailed)
7 For such a test, it is necessary to compute the degrees of freedom via a formula [reference]. For these
data, the formula gives a value of about 96.1. This was rounded to 96 for the test.
We see that the 8th graders in Pre-Algebra at the observed middle school were
significantly below the State mean in 2007 (without ALEKS) and significantly above the
State mean in 2008 (with ALEKS).
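The two comparisons against the State mean are one-sample t-tests, which can likewise be reproduced from the table's summary statistics; the sketch below uses the standard formula with df = n - 1.

```python
# One-sample t-tests of each group's mean CST score against the State
# mean for the same year, from the reported summary statistics.
from math import sqrt

def one_sample_t(mean, sd, n, mu0):
    """t statistic (and df = n - 1) for a sample mean against a fixed
    population mean, here the State mean for the same year."""
    return (mean - mu0) / (sd / sqrt(n)), n - 1

t07, df07 = one_sample_t(304, 35, 41, 316)  # 2007 group vs. 2007 State mean
t08, df08 = one_sample_t(337, 55, 58, 320)  # 2008 group vs. 2008 State mean
print(round(t07, 3), df07)  # -2.195 40
print(round(t08, 3), df08)  # 2.354 57
```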
As mentioned, an important measure according to California’s Department of Education
is the percentage of students scoring Proficient or Advanced on the CST (that is, scoring
at least 350). Figure 2 displays a comparison of percentages for the same groups of 8th
graders as in Figure 1.
In Figure 2, we see that the percentage of Mr. Petlock’s students scoring Proficient or
Advanced on the 8th grade CST was nearly triple that of the comparison group of students
from the year before. There was a general increase in these percentages at the County
and State levels from 2007 to 2008, but it appears that Mr. Petlock’s students easily
outdistanced this increase.
Figure 2: Comparison of 2007 and 2008 8th Grade CST Percentages of Proficient or
Advanced
[Bar chart of percentages scoring Proficient or Advanced. 2007 8th Grade: District 26%,
County 26%, State 23%, observed school’s 8th grade Pre-Algebra students 15%. 2008 8th
Grade: District 23%, County 33%, State 31%, observed school’s 8th grade Pre-Algebra
students 43%.]
The percentage of the observed middle school’s 8th grade Pre-Algebra students who
scored at least Proficient, shown in red, increased from well below the District,
County and State percentages in 2007 to well above these percentages in 2008. The
2008 observed middle school group shown is Mr. Petlock’s class; the 2007 observed
middle school’s group shown is a reasonable comparison group.
A (two-tailed) Z-test was done to compare the percentages for the 2007 and 2008
observed middle school groups. The table below gives the results.
2007 Observed Middle School Pre-Algebra (no ALEKS): sample size 41, 15% scoring
  Proficient or Advanced
2008 Observed Middle School Pre-Algebra (ALEKS): sample size 58, 43% scoring
  Proficient or Advanced
Results: Z ≈ 2.957, p ≈ 0.0031 (two-tailed)
We see that there was a significant difference in the percentage of students scoring
Proficient or Advanced between the 2007 8th grade Pre-Algebra students at the observed
middle school (who did not use ALEKS) and those in 2008 (who did use ALEKS).
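The reported Z value is reproduced by the standard two-proportion z-test with a pooled proportion estimate. The sketch below works from the rounded percentages in the table; exact student counts would shift the value slightly.

```python
# Two-proportion z-test (pooled under the null hypothesis) comparing the
# 2008 and 2007 Proficient-or-Advanced rates.
from math import sqrt

def two_prop_z(p1, n1, p2, n2):
    """Z statistic comparing two sample proportions, pooled under H0."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 2008 (ALEKS): 43% of 58; 2007 (no ALEKS): 15% of 41.
z = two_prop_z(0.43, 58, 0.15, 41)
print(round(z, 2))  # 2.96
```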
Again, it may be argued that the test in 2008 was easier than the test in 2007. We
performed two separate Z-tests, one comparing the 2007 observed middle school’s Pre-
Algebra group to the State percentage in 2007, and the other comparing Mr. Petlock’s
2008 Pre-Algebra group to the State percentage in 2008. These Z-tests are summarized
in the table below.
2007 Observed Middle School Pre-Algebra (no ALEKS): sample size 41, 15% scoring
  Proficient or Advanced, State percentage being compared 23%.
  Results: Z ≈ -1.217, p ≈ 0.112 (one-tailed)
2008 Observed Middle School Pre-Algebra (ALEKS): sample size 58, 43% scoring
  Proficient or Advanced, State percentage being compared 31%.
  Results: Z ≈ 1.977, p ≈ 0.024 (one-tailed)
The percentage of Mr. Petlock’s students who scored at least Proficient on the 2008 CST
was significantly above the State’s percentage that year. The percentage of 8th grade Pre-
Algebra students at the observed middle school the year before scoring at least Proficient
was below the State’s percentage, though not significantly so at the 0.05 level.
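Each of these comparisons against the State percentage is a one-sample proportion z-test under the usual normal approximation, with the State's percentage as the null proportion. Recomputing from the rounded percentages reproduces the reported values to within rounding.

```python
# One-sample proportion z-tests of each group's Proficient-or-Advanced
# rate against the State's percentage for the same year.
from math import sqrt

def one_prop_z(p_hat, n, p0):
    """Z statistic for a sample proportion against a fixed population
    proportion, here the State's percentage for the same year."""
    return (p_hat - p0) / sqrt(p0 * (1 - p0) / n)

z07 = one_prop_z(0.15, 41, 0.23)  # 2007 group vs. 2007 State percentage
z08 = one_prop_z(0.43, 58, 0.31)  # 2008 group vs. 2008 State percentage
print(round(z07, 3), round(z08, 2))  # -1.217 1.98
```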
To give weight to the observations in Figures 1 and 2, we briefly examine the
mathematical level of Mr. Petlock’s students as they entered their 8th grade year. If these
students had already been “above average” when entering their 8th grade year, then it
would be no surprise that they would be above average at the end of that year. An
obvious measure of the students’ mathematical level going into their 8th grade year is
their performance on the 7th grade CST. Figures 3 and 4 show summaries of their
performance in 2007 (before using ALEKS) and in 2008 (after using ALEKS). Figure 3
shows performance relative to the mean performances in the District, County, and State,
and Figure 4 shows the percentage of students scoring Proficient or Advanced in these
same groups.
We see from Figure 3 that the group of 51 students8 taking Mr. Petlock’s 8th grade class
were “below average” in mathematical level going into their 8th grade year – the mean for
the group in 7th grade was below the means for the County and State on the CST9, and
about average relative to the rest of the District. However, after a year with Mr. Petlock
and ALEKS, the mean for these 51 students jumped ahead of the other means. Further,
the means for the District, County, and State decreased from the 7th grade test to the 8th
grade test, but the mean for the 51 students increased.
Figure 3: Comparison of 2007 7th Grade and 2008 8th Grade CST Mean Scores
[Bar chart of mean CST scores (scale 150-400). 2007 7th Grade: District 315, County 330,
State 336, Mr. Petlock’s 2008 Pre-Algebra students 318. 2008 8th Grade: District 308,
County 322, State 320, Mr. Petlock’s 2008 Pre-Algebra students 336.]
The mean scores of the 51 students in Mr. Petlock’s 8th grade Pre-Algebra class are
shown in red. The mean increased from well “below average” in 2007 (before the
students used ALEKS) to well “above average” in 2008 (after using ALEKS with Mr.
Petlock for a year), where the “average” refers to the County or State mean.
8 For these figures, we use the subset of n=51 students (as opposed to the previous n=58 students) for
whom we have CST scores from both 2007 and 2008.
9 This is no surprise because these students were the non-special education students who were ranked
lowest in mathematics ability at the observed middle school, as determined by grades and CST scores.
We emphasize that it is not meaningful to compare scores on the 7th grade CST with
those on the 8th grade CST (http://star.cde.ca.gov/star2008/help_comparescores.asp). So,
we do not perform a statistical test comparing the scores of the 51 students in 2007 and
2008. However, we can say that these scores went from significantly below the State
mean in 2007 to significantly above in 2008, as shown in the table below.
Mr. Petlock’s 2008 students in 2007 (no ALEKS): sample size 51, mean CST score 318,
  standard deviation 44, State mean being compared 336.
  Results: t ≈ -2.921, df = 50, p ≈ 0.003 (one-tailed)
Mr. Petlock’s 2008 students in 2008 (ALEKS): sample size 51, mean CST score 336,
  standard deviation 55, State mean being compared 320.
  Results: t ≈ 2.078, df = 50, p ≈ 0.021 (one-tailed)
Finally, we look at the percentage of the students scoring Proficient or Advanced in 2007
(7th grade) and the percentage in 2008 (8th grade). We see in Figure 4 that there was a
striking increase in the percentage among the 51 students. As seen in the figure, the
percentage of these students scoring Proficient or Advanced nearly doubled from 2007
(before using ALEKS) to 2008 (after using ALEKS with Mr. Petlock), while the
percentages for the District, County, and State stayed the same or decreased from 2007 to
2008.
Figure 4: Comparison of 2007 7th Grade and 2008 8th Grade CST Percentages of
Proficient or Advanced
[Bar chart of percentages scoring Proficient or Advanced. 2007 7th Grade: District 23%,
County 37%, State 39%, Mr. Petlock’s 2008 Pre-Algebra students 22%. 2008 8th Grade:
District 23%, County 33%, State 31%, Mr. Petlock’s 2008 Pre-Algebra students 43%.]
The percentages of the 51 students in Mr. Petlock’s 8th grade Pre-Algebra class scoring at least
Proficient are shown in red. These percentages increased from below the District, County, and
State percentages in 2007 (before using ALEKS) to well above these percentages in 2008 (after
using ALEKS with Mr. Petlock). Note that none of the respective percentages for the District,
County, or State increased from 2007 to 2008, while the percentage for the group of 51 students
taking Mr. Petlock’s class nearly doubled.
Again, we cannot compare the percentage scoring Proficient or Advanced in 2007 (7th
grade) to that in 2008 (8th grade) directly, but we can say that the percentage for the 51
students went from significantly below the State percentage in 2007 to significantly
above in 2008, as summarized in the following table.
Mr. Petlock’s 2008 students in 2007 (no ALEKS): sample size 51, 22% scoring Proficient
  or Advanced, State percentage being compared 39%.
  Results: Z ≈ -2.489, p ≈ 0.006 (one-tailed)
Mr. Petlock’s 2008 students in 2008 (ALEKS): sample size 51, 43% scoring Proficient
  or Advanced, State percentage being compared 31%.
  Results: Z ≈ 1.853, p ≈ 0.032 (one-tailed)
By either measure (comparison of means or of the percentage scoring at least
Proficient), the 51 students were poor performers in 7th grade who improved, seemingly
dramatically, to above-average performers in 8th grade.
Note on Mr. Petlock’s students and the students to whom they are being compared
Again, it is clear that the effect of ALEKS on Mr. Petlock’s students cannot be fully
separated from the effect of Mr. Petlock on his students – there is no group of students in
the study who were taught by Mr. Petlock but who did not use ALEKS. So, what can
safely be concluded from the study is that systematic use of the ALEKS software by a
conscientious teacher can result in great learning gains.
B. Diagnostic Usefulness of ALEKS
The question of the validity of the ALEKS assessment has been investigated elsewhere;
see the Falmagne et al. references [to be added]. In their study, Falmagne et al. looked at
how well the response (correct or incorrect) to an “extra question” chosen from the
curriculum could be predicted based on the knowledge state in which the system had
placed the student. The results were very good, with point-biserial correlation values
above those typically obtained for pencil-and-paper standardized tests, and fewer
questions being asked in the ALEKS assessment.
The current study provides another examination of the validity of the ALEKS
assessment. In contrast to the study by Falmagne et al., the current study makes a
comparison to CST scores in measuring validity. In particular, the current study
examines how well the results of the ALEKS assessment may predict the score on the
CST. The outcome clearly has implications for the usefulness of the ALEKS assessment
as a diagnostic tool.
The same group of 58 students used for the learning results above was used for this part
of the study. These students took the CST in May of 2008. A proctored ALEKS
assessment took place in the same month and is the one used for comparison.
We note the important point that the result of an ALEKS assessment is a knowledge state,
which is a list of the problem types that the student has mastered (along with the inner
and outer fringes). As such, to assign a single numerical value to the student’s result is
an extra step, and one that loses information contained in the ALEKS assessment result.
We do so, however, for the purpose of comparison to other testing instruments.
There are several natural candidates for the single numerical value to summarize an
ALEKS assessment. In this study, we use the following two:
(a) The percentage of the ALEKS problem types that are regarded as mastered
There are about 311 problem types in the ALEKS curriculum for Pre-Algebra.
The result of an ALEKS assessment is a list of all of the problem types that the
system regards the student as having mastered. One numerical measure of how
much she knows is simply the size of the set of items that she is regarded as
having mastered, divided by 311.
(b) The percentage of the state standards that are regarded as mastered
ALEKS is aligned with the California state standards for 8th grade (in fact, for all
of grades 3-12, and for all states). This means that, for each California 8th grade
standard, ALEKS has a list of the problem types that match that standard. So,
based on the problem types in the student’s assessed knowledge state, ALEKS
can assign a percentage of the state standards mastered by the student. In the
ALEKS Instructor Module, the criterion for “mastery” may be set at 50%, 60%,
or 70%. If the criterion is set at 50%, for example, then if the student has
mastered at least half of the ALEKS problem types matching Standard A, then
Standard A is regarded as mastered by that student.
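The two summary measures can be sketched directly in code. The student's mastered set and the standards-to-problem-type mapping below are invented for illustration; only the total of about 311 problem types comes from the text.

```python
# Sketch of measures (a) and (b).  The mastered set and the mapping of
# standards to problem-type ids are hypothetical.
TOTAL_PROBLEM_TYPES = 311

def percent_problem_types_mastered(knowledge_state):
    """Measure (a): size of the assessed knowledge state over the total."""
    return len(knowledge_state) / TOTAL_PROBLEM_TYPES

def percent_standards_mastered(knowledge_state, standards, criterion=0.5):
    """Measure (b): fraction of standards for which at least `criterion`
    of the matching problem types are mastered."""
    hit = sum(1 for items in standards.values()
              if len(knowledge_state & items) / len(items) >= criterion)
    return hit / len(standards)

state = set(range(150))  # hypothetical: 150 of 311 problem types mastered
standards = {"Std A": {1, 2, 3, 200},   # 3 of 4 mastered -> counts at 50%
             "Std B": {210, 220, 230}}  # 0 of 3 mastered -> does not count
print(round(percent_problem_types_mastered(state), 2))  # 0.48
print(percent_standards_mastered(state, standards))     # 0.5
```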
In addition to using ALEKS from November to May in 2007-2008, Mr. Petlock’s
students took four “benchmark” assessments during the school year. These benchmark
assessments were the STAR Math assessments by Renaissance Learning, which are
computerized, adaptive, multiple-choice assessments (http://www.renlearn.com/sm/).
This test generates a variety of useful reports, which include norm-referenced data for
each student such as normal curve equivalent, percentile ranking, raw score, scaled score,
and grade equivalent.
The relevant Pearson correlations (r) are given below. The dependent variable in each
case is the score on the 8th grade CST taken May 7, 2008.
Independent variable                      Sample size   Pearson correlation, r   Statistically significant (.05 level)?
(a) ALEKS assessment May 6, 2008          58            0.74                     yes
(a) ALEKS assessment May 30, 2008         58            0.77                     yes
(b) ALEKS standards mastery (50%)         58            0.74                     yes
(b) ALEKS standards mastery (60%)         58            0.74                     yes
(b) ALEKS standards mastery (70%)         58            0.74                     yes
3rd Qtr STAR Math score April 1, 2008     58            0.64                     yes
4th Qtr STAR Math score May 8, 2008       57            0.68                     yes
The correlations between the ALEKS assessment result and the CST score are
consistently high, statistically significant, and comparable to (even slightly higher than)
those between the STAR Math test score and the CST score. The results strongly indicate
that ALEKS provides a valid assessment, at least as far as its ability to predict scores on
the CST.
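For completeness, the Pearson correlation used throughout this comparison can be sketched as follows. The score pairs below are hypothetical stand-ins; the study's per-student ALEKS and CST data are not reproduced here.

```python
# Sample Pearson correlation coefficient, applied to hypothetical
# (percent-mastered, CST score) pairs for five students.
from math import sqrt

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

aleks = [0.30, 0.45, 0.50, 0.65, 0.80]  # hypothetical percent mastered
cst = [290, 320, 315, 355, 390]         # hypothetical CST scores
print(round(pearson_r(aleks, cst), 2))  # 0.99
```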
Concluding remarks
The learning results obtained during the 2007-2008 school year, during Mr. Petlock’s
assignment at the observed school, stemmed from the successful combination of several
key ingredients: technology and pedagogy working together. Among the multiple
technology components implemented, ALEKS was central. However, it must be
recognized that even more important than the technology tools is that their holder
possess the requisite knowledge, experience, and expertise to put them into successful
practice.
Other Notes:
Previous teacher’s students n=41
District 2007 8th grade n=187
District 2008 8th grade n=183
County 2007 8th grade n=1,527
County 2008 8th grade n=1,519
State 2007 8th grade n=219,921
State 2008 8th grade n=210,743
* Obtained from the EduSoft website, www.edusoft.com.
See http://star.cde.ca.gov/star2007/Viewreport.asp for 2007 scores
See http://star.cde.ca.gov/star2008/Viewreport.asp for 2008 scores
--The 8th grade CST is taken by both 8th graders and 9th graders. We’re only using the
means for the 8th graders.
--The District averages do not include the students at the observed middle school. The
other averages do, because the scores from the observed middle school did not really
affect these averages. This reporting convention was used throughout this report.