The Relationship Between Reading Fluency and Reading Comprehension for Third-Grade Students
By Carla M. McConnaughhay
Submitted in Partial Fulfillment of the Requirements for the
Degree of Master of Education
May 2008
Graduate Programs in Education
Goucher College
Table of Contents

List of Tables
Abstract
I. Introduction
   Statement of the Problem
   Statement of Research Hypothesis
   Operational Definitions
II. Review of the Literature
   Reading
   Decoding
   Reading Fluency
   Reading Comprehension
   The Relationship between Reading Fluency and Reading Comprehension
   Reading Instruction
   Summary
III. Methods
   Design
   Participants
   Instruments
   Procedure
IV. Results
V. Discussion
   Implications
   Threats to Validity
   Comparison with Other Research
   Recommendations for Future Research
References

List of Tables
1. Pearson Correlation Between DIBELS Fluency and AACPS Reading Assessment 2, Comprehension Section
2. Simple Analysis of Variance (ANOVA) between DIBELS Instructional Categories and Comprehension Assessment Mean Scores and Levels
Abstract
The purpose of this study was to investigate the relationship between reading
fluency and reading comprehension. A correlational study design was used.
Participants in the study were 50 third-grade students who were enrolled in one
suburban public elementary school. Of the 50 students involved in the study, 31 were
females and 19 were males. Data regarding students’ performance on the Dynamic
Indicators of Basic Early Literacy Skills, Sixth Edition (DIBELS) and the Anne Arundel
County Public Schools Reading Assessment 2, comprehension section was collected
and analyzed using the Pearson correlation. The analysis showed a significant
relationship between third-grade students’ reading fluency rates and reading
comprehension performance. The study also examined the DIBELS instructional
categories (at risk, some risk, low risk) in relation to the comprehension assessment
levels (basic, proficient, advanced) using a simple analysis of variance (ANOVA).
Results from the ANOVA revealed that the instructional categories were highly related
to the mean comprehension score and level of performance. Recommendations for
future research include using a different comprehension measure, selecting participants
from a different grade level, and conducting an experimental study using a fluency
intervention.
CHAPTER I
INTRODUCTION
Overview
The ultimate goal of reading is comprehension and understanding. In third
grade, instruction begins to shift from learning to read to reading to learn. When
students reach middle school, less time is spent on
comprehension strategies and skills, and students are expected to understand
what they are reading. As a result, many students struggle with reading
comprehension.
Research has been done to identify ways to solve this problem. Different
interventions have been developed and there are numerous articles and books
written on which strategies and techniques can best teach children to understand
what they are reading. Comprehension is the basis for reading, and in order for
students to obtain and use effective comprehension skills and strategies they
must possess a variety of skills, including decoding and fluency (Pardo, 2004).
In recent years fluency has become a topic of interest in education. It is
often believed that fluency can be the link between decoding and
comprehension. Decoding refers to a child’s ability to recognize words. Word
recognition skills can be taught through phonemic awareness and phonics. For
many readers problems with word recognition can lead to problems with fluency,
which can lead to problems with comprehension. According to Armbruster, Lehr,
and Osborn (2001), less fluent readers focus their attention on decoding words,
leaving less attention for comprehension. When students begin to develop
decoding skills and word recognition becomes natural and automatic, gains in
fluency and comprehension can be made. Fluency also allows the reader to see
that meaning is not only carried through by words, but by expression,
punctuation, and phrasing (Rasinski, 2003). Once a student learns to decode
words accurately, effectively, and effortlessly, he or she can begin to read
passages and stories naturally and can focus on understanding.
The relationship between reading fluency and reading comprehension is
of considerable interest because it has significant implications for assessment
(Wood, 2006). Because high-stakes testing is timed, it is very
important that students read the testing material quickly and accurately,
and are able to comprehend what they are reading. A recent study by Wood
found a strong relationship between oral reading fluency and performance on the
Colorado Student Assessment Program (CSAP) for third, fourth, and fifth
graders. The CSAP is designed to measure reading comprehension and to
assess state standards in reading comprehension at each grade level. It was
found that oral reading fluency predicted CSAP reading performance equally well
for third, fourth, and fifth grade, indicating that the relationship between fluency
and comprehension is consistent across the intermediate grades (Wood).
Results of this study support the idea that short “curriculum-based measures of
reading fluency can provide important indicators of the abilities required to
perform well on standards-based reading achievement tests” (Wood, p. 100).
This demonstrated relationship between fluency, comprehension and reading
performance suggests that fluency instruction and interventions can have an
effect on reading comprehension and increase reading assessment scores.
Statement of the Problem
This study examined the relationship between reading fluency and reading
comprehension. The study was designed to determine if students’ fluency rates
are related to their reading comprehension.
Hypothesis
There will be a significant relationship between third-grade students’
reading fluency rates and reading comprehension performance.
Operational Definitions
Reading comprehension performance was defined in this study as a
student’s overall score on the Anne Arundel County Public Schools Reading
Assessment 2, comprehension section. The scores were converted to
percentages, which were defined by the county reading office as follows:
scores of 59% and below are “basic,” scores 60% to 79% are “proficient,” and
scores 80% and above are “advanced.”
Reading fluency was measured by students’ performance on the Dynamic
Indicators of Basic Early Literacy Skills, Sixth Edition (DIBELS). DIBELS defines
fluency rate as the number of correct words read per minute. Fluency rates are
divided into three categories: at risk, some risk, and low risk. An oral reading
fluency score of 66 or fewer correct words read per minute is considered “at risk.”
A score between 67 and 91 correct words read per minute is defined as “some
risk,” and an oral fluency score of 92 or higher is “low risk.”
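These cut points can be summarized as simple threshold rules. The following Python sketch is offered only as an illustration of the operational definitions above; the function names are hypothetical and are not part of DIBELS or the county assessment.

```python
def dibels_risk_category(correct_words_per_minute):
    """Map a DIBELS oral reading fluency rate to its instructional category."""
    if correct_words_per_minute <= 66:
        return "at risk"
    elif correct_words_per_minute <= 91:
        return "some risk"
    return "low risk"          # 92 or higher

def comprehension_level(percent_score):
    """Map an AACPS Reading Assessment 2 comprehension percentage to its level."""
    if percent_score <= 59:
        return "basic"
    elif percent_score <= 79:
        return "proficient"
    return "advanced"          # 80% and above

print(dibels_risk_category(88), comprehension_level(76))   # some risk proficient
```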
CHAPTER II
REVIEW OF THE LITERATURE
This literature review examines the relationship between reading fluency
and reading comprehension. The first section offers an overview of the
components in reading. A brief synopsis about decoding is examined in section
two. Section three provides an introduction to reading fluency. In section four
reading comprehension as well as instructional practices and strategies are
discussed. Section five explores the relationship between reading fluency and
reading comprehension. Reading instruction and effective methods are
investigated in section six.
Reading
Reading is the process of deriving meaning from written or printed text
(Alvermann & Montero, 2003). It is a complex process which includes many
components. According to Armbruster et al. (2001), phonemic awareness,
phonics, vocabulary, fluency, and comprehension are the five major areas of
reading. Alvermann and Montero believe instruction in phonemic awareness,
phonics, and fluency impacts children’s early reading development. It is necessary
for a child to learn and understand each area in order to achieve
reading success. Phonemic awareness is necessary for the development of
phonics; phonics is necessary for word recognition; word recognition is
necessary for fluency; and fluency is necessary for reading comprehension
(Eldredge, 2005). Pardo (2004) emphasized the relationship shared between all
components of reading when noting that, before establishing good
comprehension skills, students must acquire decoding skills, fluency skills,
background knowledge, vocabulary, motivation, and engagement.
Decoding
Decoding is the process of recognizing letters and sounds in order to read
words. Effective readers use decoding skills to translate printed text into the
sounds of language. These skills involve instruction in phonics, phonemic
awareness, and word recognition. As a child’s decoding skills become more
proficient, less attention can be spent on identifying what a word is and more time
can be spent on identifying what the word means.
Fluency is seen as the link between decoding and comprehension.
Problems with fluency may stem from poor decoding skills. A recent study
conducted by Rasinski and Padak (1998) reviewed a large number of remedial
readers and found almost all the children were well below grade level in
comprehension, decoding, and fluency. Fluency was the biggest area of concern
due to the lengthy manner in which the students decoded the words and read the
passages. Since decoding and word recognition skills were so poor, it was
difficult for the students to comprehend any of the passages (Rasinski & Padak).
Students may view reading as pronouncing words correctly and may not focus on
comprehension. When students read words automatically they have good
accuracy, and speed is not interrupted by frequent attempts to decode words.
This automatic reading can free a student’s attention to focus on comprehension
skills and strategies, and can promote a better understanding of the text.
Reading Fluency
Reading fluency is the ability to read text accurately and quickly (Hudson,
Lane, & Pullen, 2005). It is a set of skills that allows readers to rapidly decode
text while maintaining high comprehension (Hudson et al.). Fluency also involves
reading a text with proper expression. There are three major components of
fluency: accuracy, which refers to the person’s ability to read words correctly;
rate, the speed a person reads; and prosody, which is commonly referred to as
reading with feeling and involves the stress, intonation, and pauses when reading
(Hudson et al.; Rasinski, 2006). Fluency is often considered the bridge between
word recognition and comprehension (Armbruster et al., 2001; Pikulski & Chard,
2005; Walczyk & Griffith-Ross, 2007). According to Rasinski, “readers must be
able to decode words correctly and effortlessly and then put them together into
meaningful phrases with appropriate expression to make sense of what they
read” (p. 704). A recent study conducted by Eldredge (2005) suggested that
phonemic awareness and word recognition were precursors of fluency. Kuhn
(2004) believes one important reason for the need of fluency instruction is that
fluent readers no longer have to decode the majority of the words they
encounter, but instead can recognize words accurately and automatically. This
can allow readers to shift their focus to comprehension and provides the main
reason why fluency is so important.
Instruction and Interventions
Fluency instruction includes modeling oral reading rates, providing direct
instruction, providing readers with text at their independent reading level,
providing multiple opportunities to repeatedly read familiar text independently,
and providing opportunities to practice reading (Chard, Vaughn, & Tyler, 2002;
Hudson et al., 2005; Kuhn, 2004). Instruction should also provide word-study
activities to build accuracy. Fluency is not a reading program itself, but “part of a
comprehensive reading program that emphasizes both research-based practices
and reading for meaning” (Hudson et al., p. 708). This implies that fluency should
be woven into all aspects of reading instruction.
Modeling is a very important aspect of fluency instruction. Students need
to hear and see what fluent reading sounds like. Modeling is the basis of all good
fluency instruction. Teachers can implement daily classroom practices such as
read alouds, books on tape, and partner or buddy reading to provide modeling
(Armbruster et al., 2001). Guided oral instruction can also increase fluency.
Some techniques include choral, echo, phrase, and punctuation
reading (Armbruster et al.). All of these methods provide practice with accuracy,
rate, and prosody.
Another method of fluency instruction is the use of repeated readings.
With repeated readings students read a passage or story several times and are
given guidance and instruction from their teacher. According to researchers,
repeated reading can be a useful technique when instructing students to read
fluently (Armbruster et al., 2001; Chard et al., 2002; Hudson et al., 2005; Kuhn,
2004; Rasinski, 2006). The National Reading Panel investigated two approaches
to teaching fluency: repeated reading and independent silent reading. It was
found that repeated reading improved overall fluency and reading achievement,
as well as comprehension (Armbruster et al.).
Reading Comprehension
Reading comprehension can be defined as the level of understanding of a
passage or text (Bouchard & Trabasso, 2003). It is a “process in which readers
construct meaning by interacting with text through the combination of prior
knowledge and previous experience, information in the text, and the stance the
reader takes in relationship to the text” (Pardo, 2004, p. 272). The ultimate goal
of reading is to understand what has been read (Nation & Angell, 2006).
Comprehension is the reason for reading. It involves a complex process that
includes many skills and strategies (Kolić-Vehovec & Bajšanski, 2006; Nation &
Angell; Pardo). To be a good reader it is critical to not only be able to identify the
words, but to understand them as well. If readers can read the words, but do not
understand what they are reading, they are not really reading. This process
requires a number of skills, from recognizing individual words to “forming a
coherent and cohesive mental model of a text” (Nation & Angell, p. 86). Effective
reading comprehension is the culmination of mastering vocabulary, phonics,
fluency, and reading comprehension skills (Dougherty-Stahl, 2004).
Instruction and Strategies
Effective instruction includes direct explanation, modeling, guided practice,
and application (Armbruster et al., 2001; Bukowiecki, 2007; Kolić-Vehovec &
Bajšanski, 2006). Comprehension skills should be taught and applied before,
during, and after reading takes place (Bukowiecki). Instruction in comprehension
can help students understand what they have read, remember what they have
read, and communicate to others what they have read (Armbruster et al.). A
primary method of teaching reading comprehension is modeling reading
comprehension skills, a technique that accelerates the improvement of reading
comprehension. Teachers must model effective comprehension strategies.
According to Armbruster et al., “text comprehension can be improved by
instruction that helps readers use specific comprehension strategies” (p. 49).
Bukowiecki further asserts, “classroom instructors must model and directly teach
students specific strategies that will enable them not only to understand the
meaning of individual words, but also to comprehend the meaning of the entire
text” (p. 61).
Comprehension strategies must be explicitly taught, and scaffolding
should be used to ensure appropriate use of the strategies (Dougherty-Stahl, 2004). Effective strategies include making predictions, drawing
conclusions, making inferences, monitoring and clarifying, asking questions,
connecting events to prior knowledge, visualizing, and summarizing (Nation &
Angell, 2006). Dougherty-Stahl reported that good readers apply numerous
comprehension strategies such as predicting, visualizing, making inferences,
monitoring, synthesizing, and summarizing. These strategies “have the potential
to provide access to knowledge that is removed from personal experience” and
allow readers to understand and recall more of what they read (Dougherty-Stahl, p. 598).
In a recent study conducted by Kolić-Vehovec and Bajšanski (2006), upper elementary
school children’s use of comprehension monitoring, a strategy readers use to check
their understanding as they read, was associated with a significant improvement in
text-level comprehension. The correlations showed that comprehension monitoring is
considerably and consistently associated with reading comprehension for upper
elementary school-aged children (Kolić-Vehovec & Bajšanski). Furthermore, reading
comprehension can be developed by teaching comprehension strategies and by
helping readers use those strategies flexibly and in combination (Armbruster et al., 2001; Bukowiecki,
2007). By providing direct explanation, modeling, guided practice, and application
teachers can ensure the comprehension success of their students.
The Relationship between Reading Fluency and Reading Comprehension
Poor Fluency Can Have an Effect on Reading Comprehension
Comprehension is not guaranteed with fluency, but it is difficult without
fluency. If a reader has to frequently stop to figure out unknown words, most
likely the reader will not remember or understand much of what is read (Perfetti,
1985, 1999; Pikulski & Chard, 2005; Samuels & Flor, 1997). Often students
skilled in comprehension read faster than students with poor reading
comprehension (Jenkins, Fuchs, van de Broek, Espin, & Deno, 2003). Fluent
readers recognize words and comprehend at the same time, whereas less fluent
readers must focus their attention on figuring out the words, leaving them little
attention for understanding the text (Armbruster et al., 2001; Perfetti; Samuels &
Flor). When gains are made in fluency, readers can focus their attention on
comprehension and understand more of what is read (Pikulski & Chard).
If children are too focused on word reading, then little remains for higher-level comprehension (Pikulski & Chard, 2005). Two theories, the automaticity
theory (Samuels & Flor, 1997) and the verbal efficiency theory (Perfetti, 1985,
1999), highlight the harmful effects of inefficient fluency skills on comprehension.
According to both theories, beginning readers first concentrate on word reading
and gradually shift their attention to what they read and understand (Perfetti;
Samuels & Flor). Perfetti suggested that when readers focus attention heavily on
decoding accurately, less attention is available for comprehension. However,
when decoding becomes automatic, requiring little attention, more attention may
be allocated for comprehending a text (Perfetti). Thus, a direct relationship can
be assumed between fluency and reading comprehension.
According to Hudson et al. (2005), each aspect of fluency has a clear
connection to reading comprehension. For example, inaccurate word reading can
lead to misinterpretations of the story, poor automaticity can strain the reader’s
ability to construct ongoing interpretation of the story, and poor prosody can lead
to confusion through inappropriate groupings of words or the inappropriate use of
expression (Hudson et al.).
Fluency Instruction Can Have an Effect on Reading Comprehension
The National Assessment of Educational Progress (NAEP) found a close
relationship between fluency and reading comprehension (Armbruster et al.,
2001). In a representative sample of the nation’s fourth-grade students, those who
scored low on fluency measures also scored low on comprehension measures
(Armbruster et al.). This suggests that fluency is often neglected in many
classrooms across the country and may be affecting many students’ reading
comprehension. All three fluency areas -- accuracy, rate, and prosody -- need to
be developed for effective comprehensive reading instruction for students
(Hudson et al., 2005). Although some readers may recognize words
automatically in isolation or on a list, they might not read the same words fluently
when they appear in context. It is important to provide students with instruction
and practice in fluency as they read (Pikulski & Chard, 2005). A study conducted
by Jenkins et al. (2003) revealed that context fluency, which is accurately reading
words in context, was a stronger predictor of comprehension than list fluency.
The study suggested that “context fluency captures significant comprehension
processes beyond those measured by pure word-list fluency” (Jenkins et al., p.
725). These findings can allow teachers to use a measure of context fluency to
estimate overall reading comprehension.
According to Reutzel and Hollingsworth (1993), fluency development
showed a positive effect on second graders’ reading comprehension. The study
assessed the effects of developing second-grade students’ oral reading fluency
using the oral recitation lesson (ORL) and the effects that fluency training had on
reading comprehension (Reutzel & Hollingsworth). Results of this study found that
the performance of students who participated in the ORL group was “superior to
that of the control group” (Reutzel & Hollingsworth, p. 329), suggesting fluency
development had a strong effect on reading comprehension (Reutzel &
Hollingsworth).
Repeated reading is considered the most commonly recommended
procedure for improving reading rate (Armbruster et al., 2001). Repeated
reading of text aimed at developing fluency also may be related to improvement
in students’ reading comprehension (Reutzel & Hollingsworth, 1993). When
accuracy and reading rate are considered together, reading rate accounts for a
significant difference in reading comprehension, suggesting that rate is more
related to comprehension than accuracy (Jenkins et al., 2003). A study
conducted by O’Connor, White, and Swanson (2007) found that repeated reading
not only improved reading rate, but also word identification and reading
comprehension for below-level readers in grades two through four. This
suggested that repeated and monitored oral reading improved reading fluency
and overall reading achievement.
Reading Instruction
Effective reading instruction involves numerous components, including
phonemic awareness, phonics, vocabulary, fluency, and comprehension.
Teachers are responsible for modeling appropriate reading skills. According to
Bukowiecki (2007), no matter the “age, grade level, and reading proficiency of
the student, the teacher is a valuable component in the reading act” (p. 59).
Teaching decoding skills, helping students to build fluency, building and
activating prior knowledge, teaching vocabulary words, motivating students, and
engaging students in personal responses to the text are all considered effective
methods of instruction (Pardo, 2004). It is the educator’s responsibility to ensure
that effective methods are employed and strategically taught.
Summary
The ability to read is crucial for a student’s success both in and out of
school. Effective reading instruction is necessary for success in reading. In the
classroom, it needs to be recognized that knowing how to read is much more
than being able to identify the words on a page; it is being able to understand
what is being communicated as well. Thoughtful attention to fluency can have a
positive impact on reading comprehension. The ability to read fluently can
increase reading comprehension, and by focusing on fluency instruction,
educators can impact reading achievement.
CHAPTER III
METHODS
The purpose of this research study was to determine whether or not a
relationship exists between reading fluency and reading comprehension, and, if
so, to what degree.
Design
The study used a correlational design in order to gain insight into the
relationship between two variables: reading fluency and reading comprehension.
Participants in this study completed two measures that assessed their level of
reading fluency and reading comprehension. The results of both assessments
were then correlated to determine the relationship between the variables. Both
assessments were completed over a three-week period.
Participants
The participants used for this research were 50 third-grade students
ranging in age from eight to nine years old from Manor View Elementary School.
The sample consisted of 31 females and 19 males. The participants were
primarily Caucasian (60%) and African-American (30%). Other ethnic groups
represented included Latin Americans (4%), Asians (2%), and Pacific Islanders
(2%).
The participants were selected randomly from a group of 92 students
using a table of random numbers. This ensured a random sample was used to
conduct the study.
Manor View Elementary School is a public school located in Anne Arundel
County on Fort George G. Meade. The population is diverse and consists of 99%
military families. Due to the military lifestyle, Manor View has a very transient
population. The students represent a wide range of socio-economic status levels,
from lower to upper middle class.
Instruments
This study used two instruments: the Dynamic Indicators of Basic Early
Literacy Skills, Sixth Edition (DIBELS) and the Anne Arundel County Public
Schools Reading Assessment 2, comprehension section, for third grade.
DIBELS is designed to be given individually to students in grades
kindergarten through third. It is intended to identify and monitor those students
who are unlikely to meet state reading standards in third grade. DIBELS consists
of seven different assessments: Letter Naming Fluency, Initial Sound Fluency,
Phoneme Segmentation Fluency, Nonsense Word Fluency, Oral Reading
Fluency, Oral Retelling Fluency, and Word Use Fluency. Normative data was
collected between 1997 and 2001 by the Early Childhood Research Institute at
the University of Oregon. Participants were from kindergarten, first, second, and
third grade classrooms in two elementary schools (University of Oregon, 2003).
Entering scores into the online system also allows for a comparison with 300
school districts, 600 schools, and 32,000 children (Shanahan, 2004).
The researcher used DIBELS Oral Reading Fluency (DORF) to conduct
an assessment of students’ reading fluency. The DORF is a standardized test of
accuracy and fluency. Passages used in the DIBELS Oral Reading Fluency
measures were gathered from the Test of Oral Reading Fluency. Readability was
determined using the Micro Power & Light readability software, and Spache
readability was used to revise and refine passages to keep the readability in a
target range for each grade (Good & Kaminski, 2002). Passages included both
fiction and nonfiction stories.
The official DIBELS web site (University of Oregon, 2003) displays
reliability and validity information for the DORF; alternate form reliability ranges
from .89 to .96 and concurrent validity ranges from .91 to .96. These results are
consistent with reviews found in the Mental Measurements Yearbook (Shanahan, 2004).
According to Shanahan, DIBELS, particularly DORF, seemed to have fairly high
levels of test-retest (.92-.97) and alternative form (.92) reliability, as well as high
predictive and concurrent validity when compared to the Woodcock-Johnson
Reading Tests and other measures. The predictive validity coefficients were .66
and the average concurrent validity coefficients were .80 for the DIBELS Oral
Reading Fluency (Shanahan). Shanahan found DIBELS to be useful in the
classroom for its intended purpose; however, more information about the
discriminant validity with regard to the instructional categories used (at risk, some
risk, low risk) would be desirable.
The Anne Arundel County Public Schools Reading Assessment 2 for third
grade is a group administered, timed assessment. It is designed to measure
student performance in reading. Standards, indicators, and objectives are within
the Maryland Voluntary State Curriculum for Reading. The assessment is divided
into three sections: word study, vocabulary, and comprehension. The researcher
used the comprehension section to assess students’ comprehension level. The
comprehension section included five short passages, fiction and nonfiction, with
17 selected-response and four brief-constructed-response items.
Test items for the Reading Assessment were purchased from a
standardized item bank published by Harcourt. The items were selected by p-values. The p-value refers to the test item’s difficulty level. It is calculated as the
proportion of a specific group that answers a test item correctly. p-values range
in value from 0.0 to 1.0, with lower values corresponding to more difficult items
and higher values corresponding to easier items. During test construction, the
Anne Arundel County Reading Office attempted to average out the p-values so
that the test was close to a 0.6 p-value. This information only applies to the
selected-response items. The Reading Office constructed the brief-constructed-response items, so they are less reliable statistically. The Anne Arundel County
Testing and Accountability Office has run studies that indicate the benchmark
assessments are very good predictors of MSA success, believed to be a 0.8
correlation (K. Callison, personal communication, February 17, 2008).
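As an illustration of the item statistic described above (not the county's actual item bank or procedure), a p-value can be computed as the proportion of a group answering an item correctly, and the item p-values can then be averaged to check how close a draft test is to the 0.6 target; the response data below are invented.

```python
# Hypothetical item responses for illustration: 1 = correct, 0 = incorrect.
item_responses = {
    "item_1": [1, 1, 0, 1, 1, 1, 0, 1],   # an easier item
    "item_2": [0, 1, 0, 0, 1, 0, 0, 1],   # a harder item
}

def p_value(responses):
    """Proportion of the group that answered the item correctly (0.0 to 1.0)."""
    return sum(responses) / len(responses)

p_values = {item: p_value(r) for item, r in item_responses.items()}
average_p = sum(p_values.values()) / len(p_values)

print(p_values)                 # {'item_1': 0.75, 'item_2': 0.375}
print(round(average_p, 2))      # 0.56, compared with the 0.6 target
```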
Procedure
Student performance on DIBELS was measured by having students
individually read three different passages aloud for one minute. The researcher
informed the participants they would be reading three different stories aloud and
would be timed for one minute on each story. The researcher pointed to the first
word of the first passage, asked the student to begin, and started the stopwatch
when the student said the first word. Omitted words, substituted words, and
hesitations of more than three seconds were scored as errors. When the minute
was up, a
bracket was placed after the last word provided by the student. The number of
correct words per minute was the oral reading fluency total for that passage and
was recorded. The procedure was repeated for the next two passages. The
median score of the three passages was recorded as the oral reading fluency
rate. The rate was divided into three categories: at risk, some risk, and low risk.
An oral reading fluency score of 66 or less was considered “at risk.” A score
between 67 and 91 correct words read per minute was defined as “some risk,”
and an oral fluency score of 92 or higher was “low risk.”
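The scoring just described amounts to a short calculation: words read minus errors for each one-minute passage, the median of the three passage scores, and then a category lookup using the cut points above. The sketch below illustrates that arithmetic with invented numbers; it is not the official DIBELS scoring procedure or software.

```python
from statistics import median

# Hypothetical one-minute passage results as (words attempted, errors).
passages = [(98, 6), (105, 4), (88, 3)]

# Words correct per minute (WCPM) for each passage.
wcpm_scores = [attempted - errors for attempted, errors in passages]   # [92, 101, 85]

# The median of the three passages is recorded as the oral reading fluency rate.
orf_rate = median(wcpm_scores)                                         # 92

# Category cut points taken from the procedure described above.
if orf_rate <= 66:
    category = "at risk"
elif orf_rate <= 91:
    category = "some risk"
else:
    category = "low risk"

print(orf_rate, category)   # 92 low risk
```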
Participants were given 65 minutes to independently complete the reading
comprehension assessment. Students were instructed to read five short
passages and had to answer 17 selected-response and four brief-constructed-response items. The participants recorded their answers for the selected-response items on a scantron answer sheet, and brief-constructed-response
items were answered on a separate response sheet. The selected-response
items were scored using a scantron machine. The brief-constructed-response
items were scored by four third-grade teachers using a rubric system of 3, 2, 1,
or 0, three being the highest score possible. A scoring tool and sample
responses were provided by the test maker and used during scoring. Results of
the comprehension assessment were scanned for each participant and student
profile sheets were created. The scores were converted to percentages, and
the county reading office defined each percentage as follows: 59% and below is
a “basic” score, 60%-79% is a “proficient” score, and 80% and above is an
“advanced” score.
CHAPTER IV
RESULTS
The purpose of this research study was to determine whether or not a
relationship exists between reading fluency and reading comprehension, and, if
so, to what degree. The Dynamic Indicators of Basic Early Literacy Skills, Sixth
Edition (DIBELS) was used to measure reading fluency and the Anne Arundel
County Public Schools Reading Assessment 2, comprehension section, was
used to measure reading comprehension. Fifty third-grade students were
randomly selected and results of both assessments were analyzed using a
Pearson correlation. The results of the analysis are presented in Table 1 below.
Table 1
Pearson Correlation Between DIBELS Fluency and AACPS Reading Assessment 2, Comprehension Section

Measures                                              Pearson Correlation
DIBELS Fluency
AACPS Reading Assessment 2, Comprehension Section     0.783*

*p < 0.001
The data support the hypothesis that there is a significant
relationship between third-grade students’ reading fluency rates and reading
comprehension performance.
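The statistic in Table 1 can be computed from two parallel lists of paired scores. The sketch below, using invented values rather than the study's data, shows how a Pearson correlation of this kind could be obtained with the SciPy library; pearsonr returns both the correlation coefficient and its p-value.

```python
from scipy.stats import pearsonr

# Hypothetical paired scores for illustration only (not the study's data).
fluency_wcpm = [45, 60, 72, 88, 95, 110, 120, 134]    # DIBELS words correct per minute
comprehension_pct = [35, 52, 61, 70, 78, 84, 90, 95]  # AACPS comprehension percentage

r, p = pearsonr(fluency_wcpm, comprehension_pct)
print(f"r = {r:.3f}, p = {p:.4f}")   # a large positive r with a small p indicates a strong relationship
```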
CHAPTER V
DISCUSSION
The purpose of this research study was to determine whether or not a
relationship exists between reading fluency and reading comprehension, and, if
so, to what degree. The results suggested a significant relationship between
third-grade students’ reading fluency rates and reading comprehension
performance. Based on the analysis of the data using the Pearson correlation,
the relationship between the scores on the Dynamic Indicators of Basic Early
Literacy Skills, Sixth Edition (DIBELS), and the Anne Arundel County Public
Schools Reading Assessment 2, comprehension section, was statistically
significant (r = .783, p < 0.001). This indicated that the strength of association
between the variables (fluency and comprehension) was very high and the
correlation coefficient was significantly different from zero (p < 0.001). “p < 0.001”
means that the probability was less than 0.1 percent that the observed
relationship was due to chance alone. In summary, a higher score on the
DIBELS was associated with a higher score on the comprehension assessment,
and a lower score on DIBELS was related to a lower score on the
comprehension assessment.
A further analysis was used to investigate the DIBELS instructional
categories (at risk, some risk, low risk) in relation to the comprehension
assessment levels (basic, proficient, advanced). A simple analysis of variance
(ANOVA) was used to determine if there was a significant difference among the
DIBELS instructional categories and the means of the comprehension scores.
The results of the analysis are presented in Table 2 below.
Table 2
Simple Analysis of Variance (ANOVA) between DIBELS Instructional Categories and Comprehension Assessment Mean Scores and Levels

DIBELS Instructional Category    Comprehension Mean (%)    Comprehension Assessment Level
Low Risk                         87%                       Advanced
Some Risk                        76%                       Proficient
At Risk                          38%                       Basic
Results from the ANOVA revealed that the instructional categories were
highly related to the mean comprehension score and level of performance.
Students in the “at-risk” category for DIBELS scored significantly lower on the
comprehension assessment than those students in the “some-risk” and “low-risk”
categories.
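For readers who wish to see the computation, a one-way ANOVA of this kind compares the mean comprehension score across the three instructional categories. The sketch below uses invented group scores, not the study's data, and relies on SciPy's f_oneway to produce the F statistic and p-value.

```python
from scipy.stats import f_oneway

# Hypothetical comprehension percentages grouped by DIBELS instructional category.
low_risk = [92, 88, 85, 90, 83]
some_risk = [78, 74, 80, 72, 76]
at_risk = [40, 35, 42, 33, 38]

f_stat, p_value = f_oneway(low_risk, some_risk, at_risk)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")   # a small p suggests the group means differ
```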
Implications
This study is very valuable from an educator’s perspective and provides
helpful data for reading instruction. The results indicate how important fluency is
for a reader and how it can be related to achievement in reading comprehension.
Comprehension is a complex process and by focusing some attention on fluency
skills during reading instruction, a teacher can help ease this process.
The instructional categories provided by DIBELS can be a useful tool for
any teacher. Teachers could give DIBELS in the early part of the school year and
determine who their “at-risk” students are. Once a teacher has identified that
subgroup, efforts could be made to work on fluency skills as well as
comprehension strategies. These categories could also be used to predict the
outcome for the comprehension assessment. Preventive measures could be
taken to aid these students in both fluency and comprehension. Additionally,
specific interventions in fluency might be used with the “at-risk” students. It is
also important to note that if a student is a fluent reader, less attention could be
focused on fluency skills and more on comprehension skills and strategies,
such as vocabulary development.
Threats to Validity
There are several threats to validity in this study. Both measures used lack
technical evidence for validity and reliability. Most of the validity and reliability for
DIBELS was reported by the maker of the test and can be found on their website.
According to Brunsman (2004), the documentation provided by DIBELS for the
reliability of the scores and the evidence of validity for the described purposes is
inadequate. Furthermore, Brunsman argues that the available information on the
reliability and validity is insufficient to support the use of the DIBELS instructional
categories and that the developers did not describe any studies investigating the
predictive relationship of DIBELS scores to state assessments of reading
standards.
The Anne Arundel County Public Schools Reading Assessment 2 for third
grade was developed by reading teachers in the county and has little evidence of
reliability and validity. Although test items for the Reading Assessment were
purchased from a standardized item bank published by Harcourt, this was done
based on the p-value of the items and only for the selected response items. The
Reading Office developed the brief-constructed-response items, so they are less
reliable statistically. All of the information about the reliability and validity of this
test was reported by the test maker and therefore could be considered
inadequate. Another important threat is that the assessment was created in
2001, so the norms are outdated.
An additional threat to validity is related to the scoring of the brief-constructed-response items. These items were scored by four third-grade
teachers using a rubric system of 3, 2, 1, or 0, three being the highest score
possible. A scoring tool and sample responses were provided by the test maker
and used during scoring; however, the reliability of the scoring can come into
question due to the subjectivity of these items. The items were written responses,
and not all students’ responses coincided with the sample responses; therefore,
scoring was left to the interpretation of the scorer.
One final threat is related to the teachers and the amount of test
preparation before the comprehension assessment. The participants in this
research were taught by four different teachers, each with their own teaching
style and level of expertise. Of the 50 students in the sample, 12 were taught by
teacher one, 11 by teacher two, 8 by teacher three, and 19 by teacher four.
While some of the teachers found it necessary to take time to review possible
material on the test, others did not. In some of the classes, sample brief-constructed-response items that were found on the assessment were given to
students prior to the test. In addition, the assessment was administered by
different teachers, and different motivational or encouragement strategies may
have influenced a student’s performance.
Comparison with Other Research
Results of this study help to support research demonstrating the relationship
between reading fluency and reading comprehension. In a recent study
conducted by Wood (2006), a strong relationship was found between oral
reading fluency and performance on the Colorado Student Assessment Program
(CSAP) for third, fourth, and fifth graders. It was found that oral reading fluency
predicted CSAP reading performance equally well for third, fourth, and fifth
grades (Wood). The research reported by Wood supports the current study by
emphasizing the relationship between reading fluency and reading
comprehension and the use of the instructional categories provided by DIBELS.
An additional study by the National Assessment of Educational Progress
(NAEP) found a close relationship between fluency and reading comprehension
(Armbruster et al., 2001). In a representative sample of the nation’s fourth-grade
students, those who scored low on fluency measures also scored low on
comprehension measures (Armbruster et al.). This finding is similar to the current
study’s finding that students who scored “at risk” on the DIBELS assessment
also performed poorly on the comprehension assessment, achieving at the
“basic” level.
Furthermore, Rasinski (2003) supports the idea that there is a relationship between
a student’s lack of fluency and comprehension problems. Rasinski stated,
“students struggle so much with fluency, and in putting so much cognitive effort
into the task, that little is left over for understanding the text” (p. 35). Rasinski’s
statement directly supports the results of the current study by suggesting that
students who scored below average on the comprehension assessment
did so because of their level of fluency, as indicated by their DIBELS scores.
Recommendations for Future Research
Suggestions for future research include using a different comprehension
measure, selecting participants from different grade levels, and conducting an
experimental study using a fluency intervention.
DIBELS is a widely used and popular tool for assessing fluency. The
same cannot be said for the Anne Arundel County Public Schools Reading
Assessment. This assessment is used only in Anne Arundel County; therefore,
these findings would be difficult to generalize to other school districts. However, if
a study were conducted using the Maryland School Assessment (MSA), the
results could be considered more comprehensive and be used in the state of
Maryland. In addition, a study investigating the predictive relationship of DIBELS
scores to state assessments of reading standards would be significant and useful
for many school systems.
The researcher also recommends using participants from different grade
levels. The participants used for this study were all third-grade students and
results only pertain to that grade level. By using students in fourth and fifth
grades and middle school students as well, the results could be generalized to a
larger population. Also, results would indicate whether or not fluency still remains
a factor in students’ comprehension levels as they get older.
Furthermore, conducting an experimental study using a fluency
intervention could lead to establishing a relationship between reading fluency
interventions and reading comprehension. The researcher suggests pre-assessing students in reading comprehension and fluency and then providing a
six-week fluency intervention. Upon completion of the fluency intervention, the
researcher would reassess those students to determine whether growth in reading
comprehension and fluency had been made, thereby indicating whether gains in
fluency caused achievement in comprehension. The use of a control group would
help to determine if the gains in fluency influenced the comprehension success
or if time and maturity accounted for the achievement in comprehension.
REFERENCES
Alvermann, D. E., & Montero, M. K. (2003). Literacy and reading. In
Encyclopedia of Education (Vol. 4, pp. 1513-1518). New York: Macmillan.
Armbruster, B. B., Lehr, F., & Osborn, J. (2001). Put Reading First: The research
building blocks for teaching children to read: Kindergarten through grade 3.
Washington, DC: CIERA.
Bouchard, E., & Trabasso, T. (2003). Comprehension. In Encyclopedia of
Education (Vol. 6, pp. 1977-1985). New York: Macmillan.
Brunsman, B. A. (2004). Review of DIBELS: Dynamic Indicators of Basic Early
Literacy Skills (6th ed.). Mental Measurements Yearbook, 16. Retrieved
March 2, 2008, from http://www.unl.edu/buros/.
Bukowiecki, E. M. (2007). Teaching children how to read. Kappa Delta Pi
Record, 43, 58-65.
Chard, D. J., Vaughn, S., & Tyler, B. (2002). A synthesis of research on effective
interventions for building reading fluency with elementary students with
learning disabilities. Journal of Learning Disabilities, 35, 386-406.
Dougherty-Stahl, K. A. (2004). Proof, practice, and promise: Comprehension
strategy instruction in the primary grades. Reading Teacher, 57, 598-609.
Eldredge, J. L. (2005). Foundations of fluency: An exploration. Reading
Psychology, 26, 161-181.
Good, R. H., & Kaminski, R. A. (2002). DIBELS Oral Reading Fluency passages
for first through third grades (Technical Report No. 10). Eugene, OR:
University of Oregon.
Hudson, R. F., Lane, H. B., & Pullen, P. C. (2005). Reading fluency assessment
and instruction: What, why, and how? Reading Teacher, 58, 702-714.
Jenkins, J. R., Fuchs, L. S., van de Broek, P., Espin, C., & Deno, S. L. (2003).
Sources of individual differences in reading comprehension and reading
fluency. Journal of Educational Psychology, 95, 719-729.
Kolić-Vehovec, S., & Bajšanski, I. (2006). Metacognitive strategies and reading
comprehension in elementary-school students. European Journal of
Psychology of Education, 21, 439-451.
Kuhn, M. (2004). Helping students become accurate, expressive readers:
Fluency instruction for small groups. Reading Teacher, 58, 338-344.
Nation, K., & Angell, P. (2006). Learning to read and learning to comprehend.
London Review of Education, 4, 77-87.
O'Connor, R. E., White, A., & Swanson, H. L. (2007). Repeated reading versus
continuous reading: Influences on reading fluency and comprehension.
Exceptional Children, 74, 31-46.
Pardo, L. S. (2004). What every teacher needs to know about comprehension.
Reading Teacher, 58, 272-280.
Perfetti, C. A. (1985). Reading Ability. New York: Oxford University Press.
Perfetti, C. A. (1999). Cognitive research and the misconceptions of reading
education. In J. Oakhill & R. Beard (Eds.), Reading Development and the
Teaching of Reading: A Psychological Perspective (pp. 42-58). Malden,
MA: Blackwell Publishers.
Pikulski, J. J., & Chard, D. J. (2005). Fluency: Bridge between decoding and
reading comprehension. Reading Teacher, 58, 510-519.
Rasinski, T. V., & Padak, N. D. (1998). How elementary students referred for
compensatory reading instruction perform on school-based measures of
word recognition, fluency, and comprehension. Reading Psychology, 19,
185-216.
Rasinski, T. V. (2003). The Fluent Reader: Oral reading strategies for building
word recognition, fluency, and comprehension. New York: Scholastic
Professional Books.
Rasinski, T. V. (2006). Reading fluency instruction: Moving beyond accuracy,
automaticity, and prosody. Reading Teacher, 59, 704-706.
Reutzel, D. R., & Hollingsworth, P. M. (1993). Effects of fluency training on
second graders' reading comprehension. Journal of Educational Research,
86, 325-331.
Samuels, S. J., & Flor, R. F. (1997). The importance of automaticity
for developing expertise in reading. Reading and Writing Quarterly:
Overcoming Learning Difficulties, 13, 107–121.
Shanahan, T. (2004). Review of DIBELS: Dynamic Indicators of Basic Early
Literacy Skills (6th ed.). Mental Measurements Yearbook, 16. Retrieved
March 2, 2008, from http://www.unl.edu/buros/.
University of Oregon. (2003). Dynamic Indicators of Basic Early Literacy Skills
(DIBELS). Retrieved March 2, 2008, from http://dibels.uoregon.edu.
Walczyk, J. J., & Griffith-Ross, D. A. (2007). How important is reading skill
fluency for comprehension? Reading Teacher, 60, 560-569.
Wood, D. (2006). Modeling the relationship between oral reading fluency and
performance on a statewide reading test. Educational Assessment, 11, 85-104.