Running Head: PROMOTING LANGUAGE PROFICIENCY
Fostering Academic Engagement to Promote Proficiency in Second Language Learning
Sue Parler
Full Sail University
Author Note
Sue Parler, Educational Media Design and Technology, Full Sail University.
This project was conducted at DePaul Catholic High School, Wayne, NJ. Statistical
reports were furnished by Bernadette Wiltshire, Dean of Mathematics and Sciences, DePaul
Catholic High School, Wayne, NJ.
Correspondence concerning this article should be addressed to Sue Parler, Technology
Coordinator, DePaul Catholic High School, 1512 Alps Road, Wayne, NJ 07470. E-mail:
parlers@dpchs.org
Abstract
This two-stage ARP used the wealth of available technology to develop multiple-intelligence-based, technology-rich, authentic assessment opportunities in second language acquisition for the school’s lowest verbally skilled students as scored by the COOP entrance test.
During stage one, the assessment data, as scored by a panel of Spanish-speaking trained raters, were placed on a numeric scale that correlated to the ACTFL standards. These data were compared with those of a control group that participated in similar activities without benefit of technology and formative assessment. Despite the target audience’s low aptitude in language and reading, the results showed no significant difference in performance. We can reasonably conclude that the effective use of technology can close the aptitude gap to positively affect speaking and listening proficiency in a second language.
Stage two was more generative. Students were given creative freedom in a variety of exercises
through which they demonstrated their level of mastery of all four language-acquisition skills.
This freedom allowed them to take more ownership of their work. The study concluded that these types of exercises increased the students’ completion rate on non-on-demand assignments. This combination of ownership and participation had an additional positive effect on overall proficiency in the second language.
Introduction
DePaul Catholic High School is a private co-educational secondary school, located in
Wayne, New Jersey, less than twenty miles from midtown New York City. Each of DePaul’s 850
students and 76 teaching staff members is issued a Tablet PC and access to the school’s learning management system, Blackboard. The school is a fully wireless environment. However, in a school dripping with such cutting-edge technology, there is a tendency to rely on traditional paper-and-pencil means of assessment. This leaves students generally disengaged from the learning process, ignoring the fact that yesterday’s tried-and-true methods of teaching fall short with today’s “NetGen” active learners (Matulich, 2008).
The positive effects of being an engaged, active learner far transcend academic
performance. Educators who effectively infuse technology into the curriculum enhance students’
self-esteem and their independence, while also bolstering course participation (Jowallah, 2008).
The focus of this Action Research Project (ARP) is to use the available technology at
DePaul Catholic to develop alternate means of authentic assessment that encourage disengaged
students to become active participants in their own educational process.
Specific objectives. Upon the completion of Cycle 1, the students in Foundations of Spanish 1 who have engaged in technology-assisted one-to-one opportunities to practice listening and speaking skills will better approximate proficient communication, as measured by the OPI and ACTFL standards, than the students in the control group.
Upon the completion of Cycle 2, the students in Foundations of Spanish 1 will have
demonstrated competency in the following generative tools: Windows Live Movie Maker,
Audacity, PowerPoint, SMARTboard presentation, and Microsoft Publisher.
Through the use of their technological competencies, the students in Foundations of
Spanish 1 will further demonstrate their proficient communication in Spanish as measured by the
ACTFL reading and writing standards.
Upon the conclusion of this ARP, students will have become more actively engaged in
their learning process, having demonstrated an increased level of participation as measured by a
quantitative assessment survey.
The Current Research
The connection between learning and academic engagement is unmistakably established
in current research literature. Walker, Greene, and Mansell (2006) recognized that engagement was a
prerequisite for productive learning. Similarly, Shernoff, Csikszentmihalyi, Schneider, and
Shernoff (2003) found that within the scope of academics, engagement was linked to
achievement, motivation, and task persistence. Garris, Ahlers, and Driskell (2003) found a
symbiotic relationship between the extent of cognitive engagement and the retention of
information. Related findings established that the longer a student remains disengaged from his
immediate tasks, the more likely academic performance will suffer (Rock, 2004).
Students learn and retain more information when they actively participate in the process
of second language acquisition. This paper presents current research trends that use multiple
intelligence theory, computer technology, and authentic assessment as multi-faceted approaches
to engage and motivate students. Much of the literature presented here is broad-based research,
not necessarily focusing on second language (L2) learning.
The Theory of Multiple Intelligence (MI)
In redefining the established view of intelligence in 1983, Gardner profoundly impacted
conventional thinking and learning in education. Educators reflected on multiple intelligences in
their classrooms, rather than cultivating the traditional verbal and mathematical intelligences so
readily measured by standardized tests. While the psychological community was slow to
embrace Gardner’s theory as fact, the classroom educators let out a long, collective sigh of relief;
something they had known all along was finally affirmed. The results of a 2008 study on the effects of MI teaching strategies showed that not only did standardized test scores improve, but so did student comportment and parent involvement (Douglas, Burton, & Reese-Durham, 2008). In an effort to assist students in finding personal meaning in their studies, thereby greatly enhancing learning, Barrington (2004) encouraged teachers to create opportunities for students to go beyond the traditional linguistic approach to knowledge.
Glendale Community College (GCC) in Glendale, Arizona was faced with students
exhibiting a lack of motivation and poor academic performance. In 1994, GCC piloted a two-year, 10-class experiment. Students were permitted to choose a creative option to demonstrate their understanding. These options were based on Gardner’s eight intelligences. Ten years later
with the program vastly expanded, present and past students were surveyed and found to have
shown increased motivation, longer retention, and the ability to take something they learned and
apply it to a new situation (Díaz-Lefebvre, 2006). In GCC’s single study specific to second
language acquisition, a test measuring the preferred intelligence was administered to first- and
fourth-year students. The researcher compared the results and found some similarities in the
interpersonal intelligence. He concluded, without any basis for judgment, that applying MI
theory enriched both the students and teacher. He further stated that using MI theory offered a
variety of teaching/learning options (İşısağ, 2008). Unfortunately, based upon this unfounded conclusion, this GCC research, albeit the only piece directed at L2 acquisition, can be deemed neither reliable nor valid.
Technology as a Tool for Engagement
The term “technology” encompasses everything from an overhead projector to hand-held response systems. For the purposes of this paper, the term technology means Internet-accessible computers loaded with academically appropriate software. Faced with disengaged students, the University of Wolverhampton launched a program aimed at helping students become actively involved in their own learning. The program consisted of an online interactive module comprising four tasks. Leadership took into consideration the educational theories
of Bloom, Piaget, Vygotsky, and others in creating these tasks. Based on the results of surveys
and other assessments, the study concluded that technology-based learning increased
participation within a course, thereby developing students into active learners (Jowallah, 2008).
The author noted that it was not necessarily the technology in isolation, but rather the students’
ability to interact with the technology that was integral to effective motivation in computer
game-based learning. Matulich (2008) added that participating in games often resulted in a
significant increase in student performance.
With so many studies testifying to the success of using technology and games as both
motivation and performance enhancement, why isn’t the use of technology more prevalent
today? Unfortunately, the pressure to produce tangible evidence of success placed inordinate
emphasis on standardized testing (Brantley-Dias, Calandra, Harmon, & Shoffner, 2006). When
the government budgeted roughly $24 billion for the NCLB Act in 2007, it
expected measurable outcomes and looked to standardized tests. Such emphasis undermined the
potential promise of technology as a viable teaching and learning tool (Schneiderman, 2004).
Authentic Assessment Provides Significance
Gulikers, Bastiaens, Kirschner, and Kester (2008) skillfully crafted an argument that
concluded the term “authentic assessment” is in the eye of the beholder. Given that, it behooves this review to define the term as used within: “authentic assessment exercises are similar to real-world tasks that would be expected by a professional” (Chung & Behan, 2010, p. 24). Hill and
Smith (2005) tracked 12 students in three different technological education programs in three
different schools over a three-year period. They concluded that the students viewed what was
learned in life and in school as inseparable because of the significant relationship between the
two. They further concluded that the authentic situations provided a richer experience.
Özdener and Özçoban (2004) studied two different groups of students in computer
classes. They combined project-based learning and MI theory in one group and employed direct
instruction in the other. Not surprisingly, they came to the following conclusions: Project-based
teaching is more effective; students had the opportunity to use higher order thinking skills such
as problem solving and creativity in the problem-based group, which made for more permanent
and effective learning as a result. Other research has determined the connection between
authentic tasks, engaged learners and technology to be synergistic in nature, each fueling the
success of the other (Herrington, 2006).
Newsome Park, a K-5 Science, Mathematics, and Technology magnet school in Virginia, equipped with a large number of wireless laptop computers that made technology pervasively available, provided the setting for a four-year study. The school had adopted a project-based
curriculum in academic year 1999-2000, effectively changing how the curriculum was delivered
and organized. A research team measuring the success of the project-based program in 2003
concluded the following: a range of student competencies improved; there was an increase in
oral communication skills; collaboration improved; technology proficiency improved; student
attitudes towards learning improved; students gained a deeper understanding of material; and last
but not least, motivation increased (Anderson & Dexter, 2003). Authentic assessment with
integrated technology in L2 language acquisition has been limited to projects geared toward
proficiency assessment. The Computerized Oral Proficiency Instrument (COPI) is one such
project. It is purely a scoring tool. Rosetta Stone® is a very popular language-learning software; however, the only endorsement of the product comes in the form of testimonials from its users, not scholarly research.
Conclusion
It is evident that computer technology in MI-based, authentic assessment settings
meaningfully engages the student, thereby increasing overall achievement. The three subtopics
presented in this study should be viewed as an amalgamation, as it is the synergy created by their
co-existence that was addressed repeatedly by many of the resources offered here.
Grossly overlooked in current research is the impact on L2 acquisition. Current L2
computer technology focuses on scoring indices rather than instructional and/or learning
facilitators. It is clear from the literature available that further study is warranted within the scope
of computer technology and MI-based authentic assessment. Future research should not only
provide examples of use, but present empirical evidence of its value when used to re-tool the
learning environment. Furthermore, it should show a direct correlation between its use and
impact on student achievement.
Instructional Design Constraints
Time. The project was broken into two distinct time intervals, hereafter known as cycles. Cycle 1 ran in three sets of two weeks for a total of six consecutive weeks. Cycle 2 ran an additional six weeks.
Subject or content. During the first cycle, students used Blackboard’s Wimba tools to
practice speaking and listening skills in the target language. During Cycle 2, students were
challenged and engaged by audio-visual generative technologies designed to provide practice in
all four language acquisition skills: speaking, listening, reading, and writing.
Content type. During Cycle 1, the teacher took responsibility for structuring the learning
environment. The students’ tasks were explicitly prescribed, thereby yielding generally
predictable outcomes. The tasks were narrow in focus, encompassing only two of the four
language acquisition skills: listening and speaking. For these reasons, along the supplantive-generative continuum, Cycle 1 was more supplantive in nature.
By contrast, Cycle 2 was more generative. Students were given creative freedom in a
variety of exercises through which they demonstrated their level of mastery of all four language-acquisition skills. The freedom afforded the students in this cycle allowed them to take more
ownership in designing their personal learning environment.
Instructional theory or outcome. Kemp et al. (2006) noted that past instructional plans
have centered on teaching rather than learning. These plans were often ambiguous, subjective,
and based largely in intuition. They laid out a holistic plan to move from “hoped-for” student
success to genuine, measurable student learning. They noted that no matter how good the
planning was, no plan could effect change if it were used in a stringent traditional structure.
Generally, the instructional theory employed throughout this ARP was based in constructivist principles. Constructivists emphasize learning through authentic activities, often through social activity. American psychologist Carl Rogers noted that learning required dealing with the whole person -- affectively and cognitively. He saw learning as coming from within the learner. The learner in Rogers’s scenario learns for learning’s sake. Learning changes the learner – it engages the learner holistically.
Vygotsky viewed the student and teacher as equal participants in the co-construction of
the child’s knowledge (Bodrova, 2003). Working within this view, the teacher arms the student with tools that the student applies first within, and later beyond, the initial scope. This constructivist view promoted innovative activities aimed at engaging students in taking command of their own learning, assisted by technology.
Gagne’s learning domains. This ARP encompassed several of Gagne’s domains. The
first domain addressed in Cycle 1 was verbal information. The student learned how to
communicate on a rudimentary level in the target language. In Cycle 2, the student continued this
verbal information acquisition through the authentic use of the target language.
The second domain involved cognitive strategies. In Cycle 2, students were given ample
opportunity to put their rudimentary skills to the test in heretofore unrehearsed scenarios. The
students were continually challenged to think in the target language.
Lastly, the domain of attitudes was addressed with one of the project’s goals having been
student engagement. This required not only a paradigm shift from teacher-centered to student-centered learning, but a change in attitude of the once-disengaged learner.
Method
Target Audience
Table 1
Participant Characteristics

Student        Age      Home Town        Race       COOP Lang   COOP Read
T1             14.29    Paterson         Black      26          38
T2             14.53    Paterson         Black      16          65
T3             15.03    Wayne            White      16          20
T4             15.35    Paterson         White      38          29
T5             14.87    Totowa           White      25          41
T6             15.11    Lincoln Park     White      44          48
T7             14.60    West Orange      White      46          29
T8             14.80    North Haledon    White      32          47
T9             15.75    Pompton Lakes    White      6           39
T10            15.42    Paterson         Hispanic   32          20
T11            15.42    Paterson         Hispanic   32          15
Target Avg.                                         28.45       35.55
Grade 9 Avg.                                        68          67.93
The Foundations of Spanish 1 class is comprised of 11 freshmen. Its mean reading score
falls within the eighth percentile of the school’s grade nine population; the mean language score
falls within the fourth percentile of the school’s grade nine population. These scores were
measured by the 2009 Cooperative Admissions Entrance Examination (COOP). Specifically, the
group is comprised of one female and 10 male students. As of November 2010, the class ranges
in age from 14.29 to 15.75 years, with the mean age at 15.01 years. Other conceivably pertinent
data is as follows: two Black, seven White, and two Peruvian; four of these students live in urban
areas, seven in suburban areas.
The most relevant data to this ARP is their COOP scores. The COOP provides objective
information by measuring norm-referenced academic achievement in reading and language. The
students took this test in November 2009.
The data gathered upon the completion of Cycle 1 were compared with those of another group of Spanish 1 students. The comparative demographics are as follows:
Control Group
Table 2
Participant Characteristics

Student        Age      Home Town        Race       COOP Lang   COOP Read
C1             14.88    Paterson         Hispanic   93          84
C2             14.78    Budd Lake        White      85          82
C3             14.17    Wayne            White      79          84
C4             14.04    Totowa           White      77          76
C5             14.73    Bloomingdale     White      88          91
C6             14.10    Pompton Lakes    White      61          97
C7             14.73    West Caldwell    White      86          68
C8             14.81    Little Falls     White      92          99
C9             15.05    Bloomingdale     White      91          87
C10            14.77    Wayne            Black      73          78
C11            14.26    Paterson         Black      83          87
Control Avg.                                        82.55       84.81
Grade 9 Avg.                                        68          67.93
Required Resources
Students participating in the ARP needed the following technology equipment: computers
with Internet access, microphones, and webcams. They also needed the following software:
Audacity, Microsoft Office, and Windows Live Movie Maker. Furthermore, they needed the
following access: Blackboard and its Wimba toolset and Viddler to post video.
To measure the outcome of Cycle 1, this ARP called for a control group of Spanish 1 students who did the same speaking-listening exercises without benefit of the Wimba voice boards. To complete the scoring needed for these measurements, this ARP called for a rating board of impartial Spanish speakers to compare the videotaped and/or recorded messages of the Foundations of Spanish 1 students with those of the control group.
This ARP also required several scoring indices and/or surveys. A quantitative scoring
index designed to measure the students’ level of participation in class was needed prior to the
implementation of Cycle 1. The American Council on the Teaching of Foreign Languages (ACTFL)
Proficiency Guidelines and the Oral Proficiency Interview (OPI) Guidelines were used to
quantitatively and qualitatively measure the outcomes in Cycle 1. Based on the ACTFL
standards in reading and writing, a quantitative and qualitative scoring matrix was developed to
score student work in Cycle 2.
Permissions
The target audience, although minors, was involved in an everyday educational process; no additional permission was required. While parents/guardians have submitted permission to
use photos or video for an academic purpose, the commercial generated for the city project
required an additional release for intellectual property to publish the video. Both the student and
parent or guardian needed to grant this permission.
The premise of this ARP was submitted to the Dean of the World Language Department.
The Administration of DePaul Catholic High School also granted permission to proceed.
Other Constraints
A Tablet PC, equipped with a webcam and microphone, and all required software had
been issued to every student and staff member at DePaul Catholic HS. Each student had a
Blackboard account with built-in Wimba tools – including voice board. The availability of
technology was not an issue of concern.
An unforeseen infrastructure failure, such as loss of Internet access for a day or two,
would have resulted in an adjustment of the length of the cycle. Any inclement weather conditions that arose preventing class attendance were remedied by sliding the participation dates. This ARP was linear in terms of material, but not necessarily date-restricted. There were
no constraints for which there was not an easy remedy.
Activities
During Cycle 1, the students were engaged in extensive Wimba voice board sessions. It
was not a technology with which they were unfamiliar. In addition to capturing voice recordings,
Wimba voice boards allowed the instructor to leave recorded feedback for each individual
student. It was this one-to-one aspect that allowed for a more fluid conversational pace.
Authentic assessment is a form of assessment in which students are asked to perform
real-world tasks that demonstrate meaningful application of essential knowledge and skills
(Mueller, 2008). The goal of any language acquisition student, child or adult, is to be able to
communicate with other same-language speakers. During Cycle 2, the students were engaged in
real-world-engineered tasks in which not only were their listening and speaking skills put to the
test, but their reading and writing skills as well.
Weeks 1 and 2. Students entered the Wimba voice board a total of five times; each time
answering two conversational questions and formulating two questions to sustain a conversation.
Students felt prepared, as this exercise was an extension of the vocabulary and dialogs previously
covered in class.
Weeks 3 and 4. Students entered the Wimba voice board a total of four times. The
benchmarks were elevated in those two weeks. Students were given a series of questions on which they could take cursory notes. They were asked to formulate a response within 90 seconds and were expected to speak for 45 seconds. This was a prolonged recording done in the first-person singular. The students were already familiar with the vocabulary.
Weeks 5 and 6. Students entered the Wimba voice board a total of three times. They
were given a storyboard and asked to describe what they saw, addressing each section, thereby
narrating a coherent story. They had two minutes to prepare and were required to speak for at least one minute.
Cycle 2 requires a bit of background on an ongoing game theme in effect at the time: Desafío. This was a challenge-based game in which the class had been broken into four guilds: The Lee Tribe, representing Panama; The Chibcha, representing Venezuela; The Kirchners, representing Argentina; and The Resistance, representing Paraguay. The Challenge (Desafío) portion of the journey came with every assignment that involved a rubric. The winner of a quick-fire competition could challenge any other team to a “Meet the Rubric” competition. The team challenged presented its work and defended its adherence to the rubric. If the challenged team defended successfully, the match points (outlined in the rubric) went to the defenders. If the defense of the rubric failed, the challengers gained all points.
The ultimate goal of each team involved in Desafío was to be the number one guild at the
end of the year. Pedagogically, the goal was to create a team-based synergy where each learner
was responsible not only to himself, but to his guild as well.
In a corner of the Foundations of Spanish 1 classroom, there was an area adorned with the guild-generated crests and the current leader board. In several of the activities that follow, Desafío is noted as integral to the activity.
Phase 1 of Cycle 2. Food and beverage vocabulary had been incorporated into Weeks 5
and 6 of Cycle 1. Cycle 2 began with the same basic content in an authentically engineered setting. Within their guilds, students were asked to act as new restaurateurs by creating a menu in MS Publisher. The rubric, which was issued at the onset of this three-day project, outlined the individual guild members’ participation requirements. Each group member had to contribute equally as the guild divided the menu into the three main meals of the day: breakfast, lunch, and dinner. This menu included text in the target language as well as aesthetic elements.
Upon completion of the menu, students were also required to create a podcast commercial for a
local radio station. The culminating activity was Desafío: a commercial rubric challenge. The
two teams not directly involved in the challenge were the official judges of the competition and
scored the commercial according to the rubric. If the commercial was judged as superior, the
challenge was met.
Phase 2 of Cycle 2. One of the verb structures covered during this phase was “ir a + infinitive”, which translates to “going to + verb”. Each student’s name was placed into a bowl five times. Every student then drew five names, rejecting duplicates and his or her own name. This method assured that each student was given an equal opportunity to participate.
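The drawing procedure described above can be simulated in a short Python sketch. The actual draw used a physical bowl; the student names, the seed, and the simple retry strategy below are illustrative only.

```python
import random

def draw_names(students, copies=5, seed=None):
    """Simulate the bowl draw: each name goes into the bowl `copies` times;
    every student ends up holding `copies` distinct names, never his or her
    own. If a random distribution gets stuck, the whole draw is retried,
    which is simple and fast enough for a class-sized roster."""
    rng = random.Random(seed)
    while True:  # retry until a valid assignment is found
        bowl = students * copies
        rng.shuffle(bowl)
        hands = {s: set() for s in students}
        ok = True
        for name in bowl:
            # hand this slip to a random student who can legally take it
            takers = [s for s in students
                      if s != name and name not in hands[s]
                      and len(hands[s]) < copies]
            if not takers:
                ok = False
                break
            hands[rng.choice(takers)].add(name)
        if ok:
            return hands

# Illustrative roster matching the class size in Table 1
students = [f"T{i}" for i in range(1, 12)]
hands = draw_names(students, seed=7)
```

By construction, every returned hand holds exactly five distinct names, no student draws his or her own name, and each name is drawn exactly five times overall.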
Once the names were drawn, each student created one PowerPoint slide per name drawn, depicting what that student “is going to be doing” in the year 2020. Students were permitted to use images and music from the web. The slide was constructed using the drawn student’s name as
the first word of an ir a + infinitive sentence. The slide creator then placed a rectangle over the
student’s name as a mask. The slide creator projected his or her work on a SMARTboard asking
the class to guess to whom the sentence referred. Students were summatively assessed based on
the grammatical correctness of the sentence.
Phase 3 of Cycle 2. Students had completed extensive study of the Spanish-speaking world in terms of culture and geography by the time this phase started. Each student had already presented research on one previously assigned country. The vocabulary
topic covered in this next assessment was city vocabulary.
Students pretended to be travel agents attempting to create tourist interest in a city of their choice located within the country on which their previous research was conducted. The travel agent had to accomplish two tasks: 1. Create a poster or travel brochure to inspire interest, and 2.
Create a one-minute movie to be used as a TV commercial, narrated in Spanish. Students
uploaded their commercials to the class Viddler account and presented them to the class.
Portions of six class days were devoted to this multi-faceted activity. This was a rubric-based
activity, therefore making it eligible for Desafío.
Phase 4 of Cycle 2. During this phase, family vocabulary was presented to the students.
Using online Web 2.0 mind-mapping software, students created their own family trees. The Spanish vocabulary identifying each familial relationship was required as labels on the tree. Students were also expected to write a well-constructed Spanish sentence describing each person in terms of the family relationship. They took a screenshot of the tree and placed it into a one-slide PowerPoint with a 15-second Spanish voice recording as an introduction.
Honoring the possible differences in family structures and the privacy of the students, these family trees were not presented to the class. Little class time was devoted to this phase. The satisfactory completion of the assignment served as further testimony to the students’ engagement as active learners.
In each of the Cycle 2 phases described above, technology was incorporated. All four
language acquisition skills were required in each phase.
Results
Validity & Evaluation Data
Comparison. The general inquiry during Cycle 1 was to determine if the combination of
technology – specifically Wimba voice board – and individual feedback could positively affect
speaking and listening proficiency in the target language. The non-Wimba control group received feedback in the classroom and engaged in similar, if not identical, exercises without the benefits afforded by Wimba.
To determine the effectiveness, a measurement tool was devised. Spoken language proficiency was generally measured through accuracy and fluency. A panel of three teachers of Spanish assessed the proficiency level of the students involved in this project. Each of those teachers had an individual tendency in scoring. To create consistency, the teachers scored several sample recordings as individuals, compared the results, talked through any discrepancies, and derived a universal scoring model. In applied linguistics, this exercise is known as achieving inter-rater reliability.
Some of the data collected was viewed quantitatively. Errors in agreement (noun-verb/article-noun/noun-adjective/plural-singular), syntax, vocabulary, and pronunciation were tallied. Some of the data, such as the overall fluency of the speaker and the speaker’s ability to embellish or exceed the minimum verbiage, was scored qualitatively.
The quantitative data was subjected to a two-sample t-test to determine whether the
mean error tallies from the two groups were statistically different, a necessary step to guarantee a
valid comparison. The t-test yielded a p-value, which determined whether a given difference in
group means fell within the critical region, that is, whether the difference was statistically
significant. A second statistical inquiry, the two-sample t confidence interval, estimated the range
within which the true difference in means would be expected to fall if the study were repeated on
a larger scale.
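As a sketch of the mechanics: the fractional degrees of freedom reported in Table 3 suggest the unequal-variance (Welch) form of the two-sample t-test was used, an assumption on my part. The data below are illustrative error tallies, not the study's values:

```python
import math

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic and degrees of freedom (unequal
    variances). The p-value is then read from a t distribution with
    the returned df."""
    n1, n2 = len(sample_a), len(sample_b)
    m1 = sum(sample_a) / n1
    m2 = sum(sample_b) / n2
    # Sample variances (n - 1 in the denominator)
    v1 = sum((x - m1) ** 2 for x in sample_a) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in sample_b) / (n2 - 1)
    se2 = v1 / n1 + v2 / n2
    t = (m1 - m2) / math.sqrt(se2)
    # Welch-Satterthwaite degrees of freedom (generally fractional)
    df = se2 ** 2 / ((v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
    return t, df

t, df = welch_t([2, 4, 6], [1, 3, 5])  # illustrative tallies only
print(round(t, 3), round(df, 1))
```

The Welch-Satterthwaite formula is what produces non-integer df values such as the 18.863 reported for "Content" in Table 3.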
The qualitative data, as scored by our panel of trained Spanish-speaking raters, was
placed on a numeric scale correlated to the ACTFL standards. Once that qualitative data had a
numeric equivalent, a similar statistical test was performed on it.
Table 3
Two-Sample t-Test

Measure             Target M   Target SD   Ctrl M   Ctrl SD   t        p        df       95% CI LL   95% CI UL
Content             2.850      0.997       2.93     0.752     -0.221   0.827    18.863   -0.8553     0.6917
Comprehension       2.700      1.011       2.573    0.790      0.329   0.746    18.90    -0.6827     0.9373
Comprehensibility   2.480      1.090       1.945    0.362      1.549   0.147    12.175   -0.2167     1.2895
Accuracy            1.582      0.086       2.591    0.580     -3.226   0.0048   17.545   -1.668      -0.351
Fluency             2.773      1.193       2.636    0.765      0.319   0.754    17.025   -0.7651     1.0379
Average             2.48       0.997       2.54     0.592      0.156   0.878    16.260   -0.7952     0.6808
Note: M = mean; SD = standard deviation; t = t-statistic; p = p-value; df = degrees of freedom;
CI = confidence interval (LL = lower limit, UL = upper limit)
Conclusions
Given the high p-value of 0.827, the difference is not statistically significant. There is
insufficient evidence against the null hypothesis; therefore, we fail to reject it and conclude that
no difference in achievement in "Content" was demonstrated between the two groups studied.
Given the high p-value of 0.746, the difference is not statistically significant. We likewise
fail to reject the null hypothesis and conclude that no difference in achievement in
"Comprehension" was demonstrated between the two groups studied.
Given the p-value of 0.147, above the conventional 0.05 threshold, the difference is not
statistically significant. We fail to reject the null hypothesis and conclude that no difference in
achievement in "Comprehensibility" was demonstrated between the two groups studied.
Given the very low p-value of 0.0048, the difference is statistically significant. There is
strong evidence against the null hypothesis; we reject it and conclude that there is a difference in
achievement in "Accuracy" between the two groups studied.
Given the high p-value of 0.754, the difference is not statistically significant. We fail to
reject the null hypothesis and conclude that no difference in achievement in "Fluency" was
demonstrated between the two groups studied.
Given the high p-value of 0.878 for the averaged scores, the overall difference is not
statistically significant. We fail to reject the null hypothesis and conclude that no difference in
overall results was demonstrated between the two groups studied.
The 95% confidence interval means that if the sampling were repeated many times, about 95%
of the intervals so constructed would contain the true difference between the group means; where
an interval includes zero, as most here do, no difference can be claimed.
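The mechanics of forming such an interval can be sketched as follows. The data and the critical value t* = 2.776 (the 97.5th percentile of a t distribution with 4 degrees of freedom) are illustrative assumptions, not the study's values; in practice t* is read from a t table or software for the Welch degrees of freedom:

```python
import math

def mean_diff_ci(sample_a, sample_b, t_crit):
    """95% CI for the difference of means: (m1 - m2) +/- t* x SE,
    with SE from the unequal-variance (Welch) formula."""
    n1, n2 = len(sample_a), len(sample_b)
    m1 = sum(sample_a) / n1
    m2 = sum(sample_b) / n2
    v1 = sum((x - m1) ** 2 for x in sample_a) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in sample_b) / (n2 - 1)
    se = math.sqrt(v1 / n1 + v2 / n2)
    diff = m1 - m2
    return diff - t_crit * se, diff + t_crit * se

# t* = 2.776 assumed for df = 4 (illustrative data, not the study's)
lo, hi = mean_diff_ci([2, 4, 6], [1, 3, 5], t_crit=2.776)
print(round(lo, 3), round(hi, 3))  # interval spans zero: no claimed difference
```

Because this interval contains zero, the illustrative groups show no significant difference, mirroring the reasoning applied to most rows of Table 3.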
The data collected in Cycle 1 was used both formatively and summatively. Formatively,
the feedback that each student received as a result of his/her recording was meant to teach the
student how to improve – it was assessment for learning. Summatively, this same data collected
from the two groups – Wimba users and the control group – was used to determine the validity of
using Wimba’s voice board as a tool to improve listening and speaking skills in second language
acquisition.
The generative nature of Cycle 2 shifted the assessment focus toward tangible, day-to-day
work products. By producing original content using the targeted technologies, students
overtly demonstrated their competency with the technology. The qualitative scoring index
designed to measure the students' level of class participation, first applied before the
implementation of Cycle 1, was scored again at the conclusion of Cycle 2.
Criteria 1: Completion of Non-On-Demand Assignment
For the purposes of this study, "non-on-demand assignments" were defined as work done
outside face-to-face class time: in other words, homework. The baseline measurement of the
target audience's rate of completion of non-on-demand assignments in Foundations of Spanish 1
was taken at the conclusion of Quarter 1 (November 2010) and measured again at the
conclusion of Cycle 2. The table below summarizes the data:
Table 4
Non-On-Demand Assignment Completion Rate

Student   Baseline Rate   Conclusion of C2   Difference
T1        25%             87%                +62
T2        75%             100%               +25
T3        12%             87%                +75
T4        25%             75%                +50
T5        50%             75%                +25
T6        12%             75%                +63
T7        50%             100%               +50
T8        50%             100%               +50
T9        32%             60%                +28
T10       80%             100%               +20
T11       80%             100%               +20
Criteria 2: Technological Competencies
Every student was able to show competency with Windows Live Movie Maker,
Audacity, PowerPoint, SMARTboard presentation, and Microsoft Publisher.
Criteria 3: ACTFL Reading and Writing Proficiencies
These results were measured by summatively assessing the students' written skills as
demonstrated in the work products collected throughout Cycle 2. In addition, I thought it would
be interesting to note how all of this translates into grades; one would assume a correlation
among the volume of work, the increased proficiency, and the summative grade assigned. The
table below collects that data:
Table 5
Traditional Quarterly Assessment Differences

Student   Scaled Proficiency   Quarter 1 Grade   Quarter 3 Grade
T1        2.3                  D+                B-
T2        3.7                  B                 B+
T3        3.3                  D+                C-
T4        3.1                  B                 B+
T5        2.3                  B                 D+
T6        3.3                  D                 C+
T7        1.7                  C+                B
T8        2.3                  B                 C+
T9        3                    C-                C+
T10       5                    C-                B+
T11       4.7                  B                 B+
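The assumed correlation between proficiency and grades can be checked numerically. The sketch below maps the letter grades in Table 5 to conventional 4.0-scale points (an assumption; the school's actual numeric scale is not given) and computes the Pearson correlation between scaled proficiency and the Quarter 3 grade:

```python
import math

# Conventional 4.0-scale mapping (assumed; not the school's published scale)
POINTS = {"B+": 3.3, "B": 3.0, "B-": 2.7, "C+": 2.3, "C-": 1.7,
          "D+": 1.3, "D": 1.0}

# Transcribed from Table 5, students T1-T11
proficiency = [2.3, 3.7, 3.3, 3.1, 2.3, 3.3, 1.7, 2.3, 3.0, 5.0, 4.7]
q3_grades   = ["B-", "B+", "C-", "B+", "D+", "C+", "B", "C+", "C+", "B+", "B+"]
q3_points   = [POINTS[g] for g in q3_grades]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(round(pearson(proficiency, q3_points), 2))  # moderate positive correlation
```

Under this assumed grade mapping the correlation is positive but moderate, consistent with the caveat that follows about the two students whose grades declined.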
As a side note, and not meant to obscure the data, the two students whose grades
declined from Quarter 1 to Quarter 3 had taken vacations during school time: student T5 was not
present during Phase 1 of Cycle 2, and student T8 missed the first week of Phase 1.
Conclusion
The most obvious insight is the correlation between engagement and achievement as
measured by traditional quarterly assessments. Based on the data, it became essential to ask the
students why there was such a marked improvement in their completion of non-on-demand
assignments. Was it the use of technology that they found so engaging? Or was it the fact that, in
two of the four assignments, they worked in cohorts and felt a responsibility to their team to do
well?
Without question, the data derived from this Action Research Project will add to the
existing body of literature. It shows a clear correlation between academic engagement and
achievement, specifically with regard to second-language acquisition.