Using ScratchJr to foster young children’s computational thinking
competence: A case study in the third-grade computer class
Pao-Nan Chou
Center for Teacher Education, National Pingtung University of Science and Technology
pnchou@mail.npust.edu.tw
Abstract
This study investigated young children’s computational thinking (CT) development by integrating
ScratchJr into a programming curriculum. Twelve third graders (six males and six females) voluntarily
participated in an experiment-based computer class conducted at a public elementary school in Taiwan.
This study adopted a case study methodology to investigate research questions in one specific case (8-week CT educational training). A one-group quasi-experimental pretest and posttest design with the
support of qualitative observation was used to examine four research topics: CT competence progress,
programming behaviors in a CT framework, factors influencing CT competence, and learning
responses to CT training. The quantitative results indicated that students immersed in weekly
programming projects significantly improved in terms of their CT competence, which was mostly
retained one month after completion of the class. The programming behaviors indicated that students’
CT concepts (sequence, event, and parallelism) and practice (testing and debugging as well as reusing
and remixing) significantly improved. Moreover, parents’ active involvement in take-home
assignments influenced students’ long-term CT competence retention. The qualitative results indicated
that students enjoyed using tablet computers to learn ScratchJr programming and demonstrated various
learning behaviors in a three-stage instructional design model.
Keywords: Computational thinking, Visual programming language, Case study, Learning to program,
Programming behavior, Coding literacy
1. Introduction
ScratchJr is a newly developed visual programming language specifically designed for young
children (under 8 years of age) to foster basic computational thinking (CT) competence (Strawhacker,
Lee, & Bers, 2018). This programming tool, an application installed on tablet computers, is a simplified version of Scratch (Resnick et al., 2009). Scratch and ScratchJr share similar coding
learning environments; however, ScratchJr does not include some advanced features such as variable
creation and math computing. Although Falloon (2016) and Strawhacker et al. (2018) have
demonstrated some learning benefits of ScratchJr for CT instruction, no scientific evidence regarding
the effect of integrating ScratchJr into regular classes, particularly for CT competence development, is
available.
This study reports findings regarding the use of ScratchJr for CT instruction in a computer class
where elementary school students engaged in various programming activities over a single semester. A
simplified version of the CT framework developed by Brennan and Resnick (2012) was used to
document young children’s programming learning behaviors for further analysis. Instead of using large
quantitative data for learning generalization, the present study analyzed one case scenario where 12
third graders voluntarily participated in a semester-long educational experiment. The purpose of this
study was to determine the instructional effectiveness of incorporating ScratchJr into a programming
curriculum for elementary school students. Specifically, the four research objectives were to answer the
following questions:
RQ1: What CT competence progress did the students make after one semester of instruction?
RQ2: Based on a single CT framework, what did the students’ programming learning behaviors look
like?
RQ3: What potential background information (e.g., gender) influenced the students’ CT competence?
RQ4: What were young children’s learning responses to ScratchJr programming?
2. Educational importance of CT instruction
In her seminal paper, Wing (2006, p. 33) defined CT as “solving problems, designing systems, and
understanding human behavior by drawing on concepts fundamental to computer science.” Wing (2008)
further articulated that CT shares general analytical characteristics with mathematical thinking,
engineering thinking, and scientific thinking. Since then, although discussions centered on defining CT
have been controversial (e.g., Korkmaz, Cakir, & Ozden, 2017), the common core idea of CT training
in schools is to empower students to think like a computer scientist or engineer and subsequently
engage in a series of problem-solving learning tasks such as problem decomposition and pattern
recognition (Barr & Stephenson, 2011).
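To make these two tasks concrete, the following sketch (an illustration of the general idea, not an example from Barr and Stephenson) decomposes a drawing problem into small functions and captures its repeated pattern with loops:

```python
# A minimal sketch of problem decomposition and pattern recognition:
# drawing a row of squares is broken into the subproblems "draw one
# side" and "draw one square", and the repetition is captured by loops.

def draw_side(length):
    # Smallest subproblem: one side of a square.
    print(f"draw line of length {length}, then turn 90 degrees")

def draw_square(length):
    # Decomposition: a square is four identical sides.
    for _ in range(4):  # pattern: the same step repeats four times
        draw_side(length)

def draw_row(count, length):
    # Pattern recognition: a row is the same square repeated `count` times.
    for _ in range(count):
        draw_square(length)
        print("move to the next position")

draw_row(count=3, length=10)
```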
In recent years, because of potential educational benefits, CT has been considered a required skill for
digital citizens at several educational organizations. For example, the International Society for
Technology in Education (2018) has incorporated CT into one of its learning standards, thereby encouraging students to become computational thinkers who develop problem-solving strategies in a digital
society. To follow this educational trend, countries worldwide have begun to promote coding as a CT
training method in K-12 learning environments (e.g., micro:bit coding in U.K. schools; Schmidt, 2016).
However, according to an educational technology report (The Horizon Report, 2017), considerable
effort must be made by educators in developing innovative curricula when embracing coding for CT
instruction.
Hsu, Chang, and Hung (2018) conducted a meta-analysis of CT studies from 2006 to 2017 and
reported that many school teachers had adopted visual programming languages as coding tools for CT
instruction. Through a systematic literature review, Lockwood and Mooney (2017) identified a similar
trend involving visual programming languages; however, they reported that additional studies
regarding CT instructional design are urgently needed. After an in-depth analysis of intervention
studies, Lye and Koh (2014) found that all previous CT research focusing on visual programming
languages had been implemented in after-school activities and suggested that future CT studies must
investigate learning topics in naturalistic classroom settings. To respond to these research challenges,
this study mainly used one visual programming language to deliver CT instruction in a regular
computer class with a new instructional design approach at an elementary school.
3. CT instruction using visual programming languages
Using visual programming languages to develop students’ CT skills is reportedly an effective
learning strategy in elementary education (The Horizon Report, 2017). Depending on the teaching aids used, the methods for delivering CT instruction can be grouped into three categories: instruction with
programming language only, instruction with programming language combined with educational
robotics, and instruction with programming language combined with electronic devices. The first
category focuses on CT instruction only and involves use of visual programming languages without the
aid of any technological hardware. For example, Saez-Lopez, Roman-Gonzalez, and Vazquez-Cano
(2016) evaluated the use of Scratch programming in an elementary school and reported that students
improved their CT skills in problem-based learning activities. The second category advances students’
learning experiences through a combination of visual programming languages and educational robotics.
For example, in Chou (2018a), elementary school students used Blockly programming to control mini
robots. Direct qualitative observations revealed that programming design may foster students’ CT
competence. The third category is similar to the second category but involves electronic devices rather
than educational robotics. For instance, Chou (2018b) examined drone use by integrating Tynker
programming in an after-school program and observed that elementary school students’ sequencing
skills (one component of CT skills) were significantly enhanced. In the present study, the first category
(instruction with programming language only) was adopted to answer the research questions.
Although previous studies regarding varied CT instruction methods have yielded positive learning
outcomes, whether students’ proficiency in CT skills covers problem-solving competencies that are
transferable to other domains remains questionable (Millwood, Walsh, & Hooper, 2018). In the 1980s,
use of the BASIC and LOGO programming languages for developing CT skills was heavily discussed.
Pea and Kurland (1984) argued that the effects of learning to code (CT skills) on wider thinking (knowledge transfer) still required support from scientific evidence. Mayer, Dyck, and Vilberg (1986, p.
609) contended that “there is no convincing evidence that learning a program (CT skills) enhances
students’ general intellectual ability.” They further suggested that learning to program benefits only
students’ thinking skills directly related to programming languages already learned. However, in recent
years, with the advent of visual programming languages, some studies have yielded promising results.
In Lindh and Holgersson (2009), 1-year CT training through Lego robotics programming improved
some elementary school students’ logical-thinking skills. Chou (2018c) reported that one-semester CT
training through Arduino-based robotics programming may enhance elementary school students’
overall problem-solving skills. The present study focused on students’ cognitive-thinking skills (i.e.,
CT skills) identified in ScratchJr programming learning rather than knowledge transfer to other
subjects.
4. CT framework and instructional model
Using Scratch programming as an instructional example, Brennan and Resnick (2012) developed a
CT framework with three components, namely CT concepts, CT practice, and CT perspectives. This
framework has been widely used to interpret young children’s programming works (Lye & Koh, 2014).
In Brennan and Resnick’s framework, CT concepts refer to the fundamental concepts that students may
learn in the coding process. These concepts include sequences, events, parallelism, conditionals,
operators, and data. CT practice focuses on strategies that students use for coding and involves being
incremental and iterative, testing and debugging, reusing and remixing, and abstracting and
modularizing. CT perspectives are students’ “understandings of themselves, their relationships to
others, and the technological world around them” (p. 10) when building codes. CT perspectives include
expressing, connecting, and questioning. In the present study, a simplified version of Brennan and
Resnick’s CT framework was adopted to evaluate students’ programming behaviors because of the
limited features in ScratchJr and the relatively low cognitive development of young children.
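For readers unfamiliar with these terms, the sketch below illustrates the simplified framework's four concepts in ordinary Python. ScratchJr itself uses graphical blocks, not text; the sprite scripts here are hypothetical stand-ins, with asyncio approximating ScratchJr's concurrently running scripts:

```python
# An illustrative sketch (not ScratchJr itself) of the four CT concepts
# observed in this study: sequence, loops, event, and parallelism.
import asyncio

async def cat_script():
    # Sequence: blocks run one after another in a fixed order.
    print("cat: move right")
    print("cat: jump")
    # Loops: a repeat block runs its enclosed blocks several times.
    for _ in range(3):
        print("cat: spin")
        await asyncio.sleep(0)  # yield so the other script can run

async def dog_script():
    print("dog: move left")
    await asyncio.sleep(0)
    print("dog: bark")

async def on_green_flag_tapped():
    # Event: tapping the green flag starts the scripts.
    # Parallelism: both sprites' scripts run concurrently.
    await asyncio.gather(cat_script(), dog_script())

asyncio.run(on_green_flag_tapped())
```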
To meet instructional needs, previous studies have developed various models to deliver CT
instruction to elementary school students. In Lee et al. (2011), the use–modify–create model was used
to efficiently impart CT concepts to young children. The “use” stage involves borrowing someone’s idea. In the “modify” stage, students modify someone’s coding patterns, and they subsequently develop a new CT project in the “create” stage. To follow engineering design principles, Chou (2018c)
integrated a predesign, in-design, and postdesign model into a robot-programming curriculum. The
predesign and in-design stages are similar to the “use” and “modify” stages in the model of Lee et al.
The postdesign stage enables students to review and self-reflect on their CT projects. Chou (2018b)
proposed a three-stage learning progression model (copy–Tynker–create) for a drone-programming
curriculum. This model contains the necessary elements of the model of Lee et al. and differs only in
terms of stage titles. The present study is based on the model of Chou (2018b) and describes a new
three-stage model to fit curriculum requirements.
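As a rough, hypothetical illustration of the copy and modify stages (modeling a script as a simple list of block descriptions, not the study's actual materials):

```python
# Hypothetical sketch of the copy -> modify progression.

def example_script():
    # Copy stage: students reproduce the instructor's example verbatim.
    return ["start on tap", "move right x4", "say 'hi'", "end"]

def remixed_script():
    # Modify stage: reuse the example and remix it with new blocks.
    script = example_script()            # reusing
    script.insert(2, "repeat x3: spin")  # remixing: add a loop block
    script.insert(3, "play pop sound")   # remixing: add a sound block
    return script

print(remixed_script())
```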
5. CT competence measurement and factors influencing CT competence
Various achievement tests have been designed to represent students’ CT competence. Many such
tests solicit students’ problem-solving skills through well-structured programming questions that cover
CT concepts and practice. Saez-Lopez et al. (2016) developed a visual block creative computing test to
assess elementary students’ CT competence after the students had received Scratch instruction. In Chou
(2018c), an achievement test from a national programming competition was used to measure
elementary school students’ understanding of Scratch programming. Strawhacker et al. (2018) devised
a battery of video-based programming tests that differed from traditional paper-based tests to assess
young children’s CT competence after they had completed ScratchJr training. These tests require
students to view programming questions in video clips and then provide answers on a structured
answer sheet. The present study employed the tests of Strawhacker et al. to observe students’ CT
progress in a computer class.
Few related studies have analyzed the factors influencing young children’s CT competence in a
programming class. Strawhacker et al. (2018) reported that different teaching styles affected students’
acquisition of programming knowledge in different manners. Chou (2018b) indicated that gender
influenced young children’s programming patterns. However, findings from Longi (2016) regarding
college students’ competence in learning programming may provide insight regarding the potential
factors influencing CT competence. Through a systematic literature review, Longi summarized that
two major factors, namely students’ background information and psychological characteristics, may
predict students’ learning performance in programming courses. Students’ background information
includes gender, prior programming learning experiences, and math skills. Psychological
characteristics include any valid psychological measurements used to assess students’ learning styles,
such as self-efficacy. In addition to the aforementioned two factors, Akinola and Nosiru (2014) found
that lecturers’ teaching styles and attitudes played key roles in students’ programming performance. In
the present study, only background information was collected for further analysis; this information
included that regarding gender, prior programming learning experiences, math skills, the assigned
instructional group, and extent of parental involvement.
6. Research methods
6.1. Research design
Because the study focused on one specific case (8-week CT educational training) in school (Creswell,
2007), a case study methodology was adopted to investigate the research questions. According to Yin
(2003), case study research might contain quantitative and qualitative elements. In the present study, a
one-group quasi-experimental pretest and posttest design with the support of qualitative observation
was used to examine four research topics: (a) CT competence progress, (b) programming behaviors in a
CT framework, (c) factors influencing CT competence, and (d) learning responses to CT training. Prior
to the experiment, students were asked to complete a CT competence test (pretest) to assess their prior
programming learning experiences. Subsequently, to facilitate the experimental process, a 2-week
programming orientation course was conducted for the students to enable them to comprehend the
fundamental knowledge of the visual programming language (i.e., the ScratchJr platform and
programming blocks). During the 8-week experiment, the students had to complete eight CT projects in
class, as well as take-home written assignments. Upon completion of the experiment, the students
received a CT competence posttest. The same CT competence test, with its items presented in a different order (delayed posttest), was administered to the students 4 weeks after the posttest had been completed. Fig. 1 illustrates the research design of this study.
Fig. 1. Research design of this study.
6.2. Research participants
The principal researcher collaborated with a public elementary school in Taiwan that was attempting
to promote programming learning among young children by creating an experiment-based computer
class. Two students were randomly selected for this study from each of the six third-grade classes at the
school. Thus, the class included 12 third graders (six males and six females). A small class was
recruited because several programming behaviors of the students needed to be documented by the
instructor. This semester-long study was implemented in a regular classroom, and the 12 students were
randomly divided into three learning groups with equal gender composition. At the beginning of this study, none of the students had programming experience, and all were aged less than 8 years. Moreover, none of the participants had a tablet computer with ScratchJr at home. Because one male student
dropped out of the class during the semester, only 11 datasets were used for further analysis. Fig. 2
depicts the learning environment in the computer class.
Fig. 2. A student learning to program in the classroom.
6.3. Research instrument
6.3.1. ScratchJr on the tablet computer
In the weekly computer class, the students used ScratchJr to engage in CT learning activities. Each
student was equipped with one small tablet computer (iPad mini). Students completed programming projects individually, and four students (two males and two females) formed each learning group. After the
class, these students’ programming works were automatically saved on the tablet computers as weekly
learning evidence.
6.3.2. CT competence test
The learning achievement test developed by Strawhacker et al. (2018) was used to measure the
students’ CT competence. The test contains four categories: fixing the program, circling the blocks,
matching the program, and reverse engineering. The first three categories involve multiple-choice
questions, whereas the reverse engineering category requires the student to submit open responses.
Irrespective of the type of question, each test item is accompanied by one short video clip that
demonstrates a programming debugging process. This study adopted only the multiple-choice test
items to achieve objective test scores. The score range of the test was 0 to 9. When the instructor
administered the test, students were to carefully observe the video clips and circle the correct
programming blocks printed on the paper. The high reliability and validity of the test are reported in
Strawhacker et al.
6.3.3. Take-home assignments
In this study, weekly take-home assignments were developed for the students. The learning content
was consistent with what students had learned in class. Each assignment was a written assignment with
no requirement for ScratchJr use. However, one question in the assignment needed to be answered after
an instructional video had been viewed. Fig. 3 illustrates one of the weekly assignments.
6.3.4. Learning observation document
The instructor used a learning observation document to observe the students’ learning progress in
class. The document contained three parts: CT concept, CT practice, and active learning. A detailed
description of the document is provided in Tables 1 to 3. To achieve objective judgments, the principal
researcher and the instructor collaboratively discussed weekly observation results. If controversial
issues appeared in the discussion process, a triangulation method was used to examine data among
students’ programming works, the researcher’s field observations (or video clips), and the instructor’s
observation results.
6.3.5. Students’ background information
The students’ background information was collected, including information on the instructional
group, gender, extent of parental involvement, and math skills. This information is presented in Table 4.
When the students enrolled in the computer class, their parents were notified that they did not need to
accompany their children during assignment writing. After completion of the class, the parents were
asked about the extent of their assignment involvement through a telephone interview. The parents of
two students reported that they had not been involved in their children’s assignments.
6.3.6. Observation documents
The principal researcher conducted field observations to document student learning in class. Weekly
observations were recorded in a journal. Moreover, the instructor summarized the weekly reflection
notes after classes.
Fig. 3. Example of a take-home assignment.
Table 1 CT concepts in the learning portfolio document.

| CT Concepts | ScratchJr Ability | Score |
|---|---|---|
| CT1: Sequence | Able to sequence the blocks (Example: students may put the blocks in a sequence) | 1 (very poor) to 5 (very good) |
| CT2: Loops | Able to use loop blocks (Example: students may use several loop blocks to simplify their programming projects) | 1 (very poor) to 5 (very good) |
| CT3: Event | Able to use event blocks (Example: students may add as many event blocks as possible) | 1 (very poor) to 5 (very good) |
| CT4: Parallelism | Able to trigger more than two objects spontaneously by using event blocks (Example: students may use more event blocks to trigger many objects) | 1 (very poor) to 5 (very good) |

Table 2 CT practice in the learning portfolio document.

| CT Practice | Programming Behaviors | Score |
|---|---|---|
| CTPR1: Testing and debugging | Able to demonstrate testing and debugging behaviors (Example: students may try to find programming errors by repeatedly running their projects) | 1 (very poor) to 5 (very good) |
| CTPR2: Reusing and remixing | Able to modify someone’s work (Example: students may try to add as many additional programming blocks as possible) | 1 (very poor) to 5 (very good) |

Table 3 Active learning in the learning portfolio document.

| Active Learning | Learning Behaviors | Score |
|---|---|---|
| AL1: Learning attention | Able to keep focus on the project in class (Example: students may do other tasks not related to programming projects) | 1 (very poor) to 5 (very good) |
| AL2: Assignment | Able to complete take-home assignments (Example: the instructor may check whether students completed take-home assignments at the beginning of each class) | 1 (very poor) to 5 (very good) |

Table 4 Students’ background information.

| Background Information | Description | Status (number) |
|---|---|---|
| Instructional group | Three learning groups (1–3) | 1 (4), 2 (4), 3 (3) |
| Gender | 1 (male), 2 (female) | 1 (5), 2 (6) |
| Parent-involvement time | 1 (frequent involvement: parents accompanied students during assignment writing); 2 (less involvement: no accompaniment, or early withdrawal during assignment writing) | 1 (5), 2 (6) |
| Math skill | Based on the median math score in the previous semester, students were divided into two groups: 1 (above average), 2 (below average) | 1 (5), 2 (6) |
6.4. Instructional design
Various simple and complex CT projects were developed for the educational experiments (see
Appendix). Only one instructor taught the weekly classes, each of which was scheduled as a 2-h
learning session (including a 20-min break). A three-stage instructional model (Fig. 4) was used to
facilitate the students’ learning progress. In the review stage (approximately 20 min), the students
discussed their take-home assignment. In the copy stage (approximately 40 min), the instructor
imparted the project requirement and some programming skills. Subsequently, the students practiced
programming examples from learning materials. In the modify stage (approximately 40 min), the
students modified the programming examples by adding as many of their own ideas as possible. They
were encouraged to incorporate all the CT concepts (sequence, loops, event, and parallelism) into their
projects. During project development, the instructor observed the students’ performance by walking
around the classroom and assisted the students when learning problems occurred.
Fig. 4. Three-stage instructional model.
6.5. Data analysis
Descriptive statistics, the t test, a one-way analysis of variance (ANOVA), Spearman correlation
analysis, and partial correlation analysis were used to investigate three types of quantitative data: CT
competence progress, coding learning behaviors, and factors influencing CT competence. The students’
responses were summarized in a qualitative format through the categorization of content analysis
(Neuendorf, 2002). Moreover, a data triangulation method (Patton, 2002) was employed to confirm
data consistency among the students’ levels of programming engagement, the instructor’s class
observations, and the researcher’s observations (Fig. 5).
Fig. 5. Data triangulation in this study.
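For illustration, the sketch below shows how the paired t test and Spearman correlation reported in Section 7 might be computed with SciPy; all arrays contain placeholder values, not the study's data:

```python
# A minimal sketch, assuming placeholder scores for 11 students,
# of two of the tests used in this study.
import numpy as np
from scipy import stats

pretest = np.array([1, 0, 2, 1, 1, 0, 1, 2, 1, 1, 1])
posttest = np.array([8, 7, 9, 7, 8, 6, 7, 8, 7, 8, 7])

# Paired t test: did CT scores change from pretest to posttest?
t, p = stats.ttest_rel(posttest, pretest)
print(f"paired t = {t:.2f}, p = {p:.3f}")

# Spearman correlation: does a weekly score track project difficulty?
difficulty = np.array([1, 1, 1, 2, 2, 2, 3, 3])  # stars per weekly project
weekly_ct1 = np.array([4.2, 4.3, 4.5, 4.6, 4.7, 4.8, 4.9, 5.0])
rho, p = stats.spearmanr(difficulty, weekly_ct1)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```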
7. Results and discussion
7.1. CT competence progress
The results of the t test are summarized in Table 5. The findings indicated that the students’ CT
competence had significantly improved (t = 16.78, p < 0.01) upon completion of the 8-week
educational training course but had significantly worsened (t = 2.78, p < 0.05) one month after
completion of the experiment. However, from a mean difference (0.55) perspective, the students’
learning retention of CT knowledge did not decrease considerably.
Table 5 Results of the t test.

| Item | Mean Difference | DF | t | p |
|---|---|---|---|---|
| Posttest–Pretest | 6.5 | 10 | 16.78 | 0.00** |
| Delayed Posttest–Posttest | 0.55 | 10 | 2.78 | 0.02* |

Note: Pretest (M = 1, SD = 0.77); posttest (M = 7.5, SD = 1.11); delayed posttest (M = 6.95, SD = 1.33). *p < 0.05; **p < 0.01.
7.2. Programming learning behaviors
Table 6 presents the descriptive statistics and t-test results for the CT concepts and practice. The
findings revealed that the students demonstrated proficiency on the sequence (CT1/M = 4.78), event
(CT3/M = 4.03), and parallelism (CT4/M = 4.08) concepts but did not exhibit good performance on the
loops concept (CT2/M = 3.6). Moreover, the students’ coding behaviors for testing and debugging
(CTPR1/M = 4.13) and for reusing and remixing (CTPR2/M = 4.31) exhibited adequate scores. This
study used 3.5 as a comparative standard for further one-sample t-test analysis. The findings indicated that the students’ scores for the CT concepts of sequence (t = 11.54, p < 0.01), event (t = 3.38, p < 0.05), and parallelism (t = 4.05, p < 0.01) and for the CT practice of testing and debugging (t = 3.05, p < 0.05) and reusing and remixing (t = 4.93, p < 0.01) significantly exceeded this standard.
The students’ weekly progress in the CT concepts and practice is illustrated in Figs. 6 and 7. The
findings indicated that the students’ proficiency improved steadily for the sequence (CT1) concept. The
students’ proficiency for the loops (CT2), event (CT3), and parallelism (CT4) concepts fluctuated
considerably; however, high proficiency was achieved for these three concepts in the final 2 weeks of
class. Although the students’ testing and debugging (CTPR1) and reusing and remixing (CTPR2)
behaviors also fluctuated considerably, their weekly performance surpassed the upper-intermediate
threshold (4).
Table 6 Descriptive statistics and t-test results for the CT concepts and practice.

| Category | M | SD | t | DF | p |
|---|---|---|---|---|---|
| CT1: Sequence | 4.78 | 0.37 | 11.54 | 10 | 0.00** |
| CT2: Loops | 3.60 | 0.38 | 0.88 | 10 | 0.40 |
| CT3: Event | 4.03 | 0.52 | 3.38 | 10 | 0.01* |
| CT4: Parallelism | 4.08 | 0.48 | 4.05 | 10 | 0.00** |
| CTPR1: Testing and debugging | 4.13 | 0.68 | 3.05 | 10 | 0.01* |
| CTPR2: Reusing and remixing | 4.31 | 0.54 | 4.93 | 10 | 0.00** |

*p < 0.05; **p < 0.01 (one-sample t tests against the comparative standard of 3.5)
Fig. 6. Learning progress of the CT concepts (weekly mean scores on a 1–5 scale for CT1–CT4 across Weeks 1–8).
Fig. 7. Learning progress of the CT practice (weekly mean scores on a 1–5 scale for CTPR1 and CTPR2 across Weeks 1–8).
Spearman correlation analysis was performed to confirm the interrelationships between the CT
concepts (summing up weekly scores), CT practice (summing up weekly scores), and project difficulty
(determined by the complexity of block use in each project); the results are listed in Table 7. The
findings indicated that CT1 (r = 0.84, p < 0.05), CT2 (r = 0.96, p < 0.01), and CT3 (r = 0.87, p < 0.05)
positively correlated with project difficulty. Thus, when the project difficulty increased, the students’
CT concept scores (sequence, loops, and event) tended to increase.
Table 7 Results of Spearman correlation analysis.

| Item | CT1 | CT2 | CT3 | CT4 | CTPR1 | CTPR2 |
|---|---|---|---|---|---|---|
| Project difficulty: r (DF = 9) | 0.84 | 0.96 | 0.87 | 0.67 | 0.42 | 0.26 |
| p | 0.01* | 0.00** | 0.01* | 0.07 | 0.30 | 0.54 |

*p < 0.05; **p < 0.01
7.3. Factors influencing CT competence
The results of the one-way ANOVA for the posttest and delayed posttest are summarized in Tables 8
and 9, respectively. The findings indicated that the students’ background information did not influence
their CT competence after 8 weeks of CT training. However, in the delayed posttest, the extent of
parental involvement exerted a significant influence (F = 7.74, p < 0.05). Moreover, partial correlation
analysis was performed to confirm whether the students’ in-class learning behaviors influenced their
CT competence. The results presented in Tables 10 and 11 indicate that no relationship between in-class learning behaviors and CT competence was observed after removal of the effect of the pretest.
Table 8 Results of the one-way ANOVA for the posttest.

| Item | SS | DF | MS | F | p |
|---|---|---|---|---|---|
| Instructional group | 4.96 | 2 | 2.48 | 2.63 | 0.13 |
| Gender | 1.47 | 1 | 1.47 | 1.20 | 0.30 |
| Parent-involvement time | 4.49 | 1 | 4.49 | 5.05 | 0.05 |
| Math skill | 0.00 | 1 | 0.00 | 0.00 | 1.00 |
Table 9 Results of the one-way ANOVA for the delayed posttest.

| Item | SS | DF | MS | F | p | Post Hoc |
|---|---|---|---|---|---|---|
| Instructional group | 4.56 | 2 | 2.28 | 1.39 | 0.30 | |
| Gender | 2.72 | 1 | 2.72 | 1.64 | 0.23 | |
| Parent-involvement time | 8.20 | 1 | 8.20 | 7.74 | 0.02* | A > B |
| Math skill | 0.03 | 1 | 0.03 | 0.01 | 0.91 | |

Note: A (frequent involvement): M = 7.9, SD = 1.14; B (less involvement): M = 6.17, SD = 0.93. *p < 0.05.
Table 10 Results of partial correlation analysis for the posttest.

| Item | AL1 | AL2 | CT | CTPR |
|---|---|---|---|---|
| Posttest: r (DF = 9) | -0.08 | 0.46 | 0.29 | 0.51 |
| p | 0.83 | 0.18 | 0.41 | 0.13 |

Table 11 Results of partial correlation analysis for the delayed posttest.

| Item | AL1 | AL2 | CT | CTPR |
|---|---|---|---|---|
| Delayed Posttest: r (DF = 9) | -0.10 | -0.25 | 0.15 | 0.27 |
| p | 0.79 | 0.49 | 0.68 | 0.45 |
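For illustration, a minimal sketch of one common way to compute a partial correlation such as those in Tables 10 and 11 (correlating residuals after regressing out the pretest covariate) follows; the data are placeholders, not the study's:

```python
# A minimal sketch of a first-order partial correlation, computed by
# correlating the residuals that remain after the linear effect of the
# covariate (here, the pretest) has been regressed out of each variable.
import numpy as np
from scipy import stats

def partial_corr(x, y, covar):
    """Correlate x and y after removing the linear effect of covar."""
    res_x = x - np.polyval(np.polyfit(covar, x, 1), covar)
    res_y = y - np.polyval(np.polyfit(covar, y, 1), covar)
    return stats.pearsonr(res_x, res_y)

# Placeholder data for 11 students: in-class attention vs. posttest.
al1 = np.array([3.5, 4.0, 4.2, 3.8, 4.5, 4.1, 3.9, 4.3, 4.4, 3.7, 4.0])
posttest = np.array([8, 7, 9, 7, 8, 6, 7, 8, 7, 8, 7])
pretest = np.array([1, 0, 2, 1, 1, 0, 1, 2, 1, 1, 1])

r, p = partial_corr(al1, posttest, pretest)
print(f"partial r = {r:.2f}, p = {p:.3f}")
```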
7.4. Learning responses to ScratchJr programming
According to the results of content analysis, the qualitative data from the instructor and researcher
were grouped into nine categories: group learning, programming language, tablet computer, in the
review stage, in the copy stage, in the modify stage, CT concepts, CT practice, and learning focus. The
representative quotations in the results are summarized in Table 12. Overall, the qualitative findings
could serve as an additional resource to support the quantitative data.
Table 12 Observation results from the instructor and researcher.

Group learning
1. “When the students encountered problems, their first reaction was to seek help from their peers in the group rather than the instructor.” (Instructor note)
2. “The more skillful students in the group attempted to help those who were behind the project schedule.” (Researcher journal)

Programming language
1. “The students enjoyed ScratchJr programming. The easy-to-use interface attracted their attention.” (Instructor note)
2. “I asked some students in class. The students responded that ScratchJr programming was like playing a game. They felt so excited!” (Researcher journal)
3. “A few students complained about the limited range of features in ScratchJr because they wanted to create fancier projects.” (Researcher journal)

Tablet computer
1. “The students preferred the tablet computers to desktop computers perhaps because they could easily understand how to use them.” (Instructor note)
2. “I observed an advantage in using tablet computer programming. The students often carried their tablet computers around to discuss their work with peers.” (Researcher journal)

In the review stage
1. “Certain students often did not complete the assignments and often skipped the difficult ones.” (Instructor note)
2. “The students wanted me to accelerate the assignment discussion so that they could have additional time for programming.” (Instructor note)
3. “When the instructor discussed the assignment, most of the students seemed to lose focus because they wanted to play ScratchJr on their tablet computers.” (Researcher journal)
4. “A few students reported that their parents wanted them to demonstrate what they had learned in class. Some students reported that their parents monitored their assignment progress at all times.” (Researcher journal)

In the copy stage
1. “The students had no problems retaining what the instructor had taught them in class. They required only time to assemble the programming blocks.” (Instructor note)
2. “After receiving the weekly programming knowledge, the students were eager to practice programming examples.” (Researcher journal)
3. “Some skillful students challenged the main topic in the weekly schedule and attempted to express their creative ideas.” (Researcher journal)

In the modify stage
1. “The students did not add additional programming blocks if the instructor had not encouraged them to do so.” (Instructor note)
2. “Some of the students attempted to create complex works by adding many programming blocks. They showcased their works among various learning groups.” (Researcher journal)
3. “The students appeared content with the work they had completed if the instructor did not encourage them to modify their programming content or add new content.” (Researcher journal)

CT concepts
1. “The students were able to easily assemble programming blocks.” (Instructor note)
2. “The students did not like using loops blocks even though they understood the purpose of said blocks.” (Instructor note)
3. “The students enjoyed using the event and parallelism blocks to decorate their programming projects.” (Instructor note)

CT practice
1. “The students were able to easily modify their programming examples but required time to think about how to add additional blocks.” (Instructor note)
2. “The instructor provided instant support to students who encountered debugging troubles.” (Researcher journal)
3. “Several of the students helped their peers to find solutions to programming problems.” (Instructor note)
4. “The students were able to independently engage in the testing and debugging process. However, sometimes they did not obtain the results that they wanted.” (Researcher journal)

Learning focus
1. “If a student had felt unwell in a previous class, their negative emotions influenced their learning focus during project development.” (Instructor note)
2. “Some students often used other apps on their tablet computers, and a few often chatted with their peers. The instructor would notify them to return to the learning schedule.” (Researcher journal)
Regarding the information in Table 12, the nine qualitative categories can be summarized into four themes: regular learning behaviors (group learning and learning focus), technology use (programming language and tablet computer), instructional model (review, copy, and modify stages), and CT learning behaviors (CT concepts and practice). In the first theme, although some students often lacked concentration in class, most students supported their peers during project development. The second theme indicates that most students enjoyed using ScratchJr on their tablet computers. In the third theme, a learning problem appeared in the review stage, where several students disliked the assignment discussion because no technological tools were used at that stage. Few learning problems were identified in the copy and modify stages; however, continuous encouragement from the instructor was needed to produce better programming projects in the modify stage. The fourth theme demonstrates how the students applied CT concepts and practice to their project development. The students could put CT concepts into practice, particularly sequence, event, and parallelism, but they disliked using loops blocks to build their projects. As for CT practice, most students engaged actively in testing and debugging.
7.5. Overall discussion
7.5.1. Response to RQ1
After the 2-week programming orientation course, the students began to incorporate what they had
learned into various theme-based projects. The quantitative findings indicated that weekly ScratchJr
training significantly improved the students’ CT competence. Thus, with the aid of the 2-week
orientation course, the 8-week CT project training course efficiently fostered young children’s CT
competence. These results are supported by those of Saez-Lopez et al. (2016) and Chou (2018c). In
Saez-Lopez et al. (2016), elementary school students engaging in Scratch activities exhibited
significant improvement on a programming (or CT) knowledge test. Moreover, in Chou (2018c),
significant learning progress was identified on a Scratch achievement (or CT) test after elementary
school students had immersed themselves in weekly educational robotics programming classes.
However, one month after completion of the experiment, although students’ learning retention of CT
knowledge had remained at an acceptable level, their CT competence had significantly worsened. A
possible explanation was that a small sample size and small score range of the test led to the statistical
significance.
7.5.2. Response to RQ2
During the 8-week CT project training course in the present study, the students demonstrated diverse
programming learning patterns for CT concepts. From a weekly progress perspective, the students
exhibited higher competence in all the CT concepts (sequence, loops, event, and parallelism) in the
final week of training than in the previous weeks, even though their competence in the CT concepts
had fluctuated considerably over the preceding 7 weeks. Inferential statistics revealed that the students scored significantly above the comparative standard in all the CT concepts except loops. A possible reason for the weakness in the loops concept was identified in the instructor’s notes, which indicated that the students had not liked to use loops blocks. This finding was in agreement with an observation of Chou (2018a, b),
that elementary school students had preferred not using loops blocks to design their programming
works. One finding of the present study was that most of the CT concepts (sequence, loops, and event)
were significantly related to project difficulty. When the project difficulty increased, the students’ CT
concepts sometimes improved; a reasonable explanation behind this phenomenon is that a more
difficult project may promote the idea of using additional CT concepts.
The learning patterns of CT practice differed from those of CT concepts. The students’ weekly CT
practice (testing and debugging as well as reusing and remixing) fluctuated considerably and was
unrelated to project difficulty; however, the weekly CT practice remained at an upper-intermediate
level from week to week. In addition, the inferential statistics indicated that the 8-week CT project
training course significantly enhanced the students’ CT practice. These findings can be attributed to
two factors, namely learning support provided by peers and the instructor and the learning process in
the instructional design model. First, the qualitative data indicated that the students and instructor had
formed a strong support network where the students were able to seek help during programming
development. Such learning scaffolding (Jonassen, 1999; Donohue, 2015) indirectly strengthened the
students’ willingness for programming testing and debugging. Second, the second stage (copy) in the
instructional design model enabled the students to practice programming examples and further increase
their understanding of reusing and remixing, which was a required element in the final stage (modify)
of the instructional design model.
7.5.3. Response to RQ3
The results of the CT competence pretest indicated that the students had little programming
knowledge upon their enrolment in the computer class. The starting points of programming learning for
all students were almost the same. After the 8-week CT project training course, the inferential statistics
indicated that the collected background information (instructional group, gender, extent of parental
involvement, and math skills) did not influence the students’ CT competence. Because this study
focused on elementary school students, the findings differed considerably from those of an adult
learning study that identified effects of background information (e.g., gender, math skills) on
programming (CT) competence (Longi, 2016). However, one month after project completion, the
extent of parental involvement exerted a strong influence on the students’ CT competence. Thus,
frequent parental involvement in assignments may strengthen students’ learning retention of CT
knowledge. This result was interpreted through the qualitative data, which also indicated that students
may pay more attention to assignments when in their parents’ company, and frequent parental
involvement might reinforce students’ CT knowledge recall. Similar to background information, the
students’ in-class learning behaviors (CT concepts, CT practice, and active learning) were also not
related to their CT competence. A possible reason for this outcome is that the students’ in-class
learning behaviors reflected only their learning status in weekly projects and did not indicate their CT
competence.
7.5.4. Response to RQ4
The qualitative learning responses from the researcher and instructor indicated learning advantages of
ScratchJr (software) and the tablet computers (hardware) in the computer class. Although a few
students expressed displeasure regarding the limited range of features, the game-based interface of the
ScratchJr platform enabled the students to easily explore potential programming features (Falloon,
2016; Strawhacker et al., 2018). Furthermore, the mobility of the tablet computers empowered the
students to conduct their programming and discuss programming problems with their classmates (Chou
& Feng, 2019). Regarding the instructional design model, the qualitative data indicated that the
students seemed to lose learning interest in the first stage (review) because the review activity of take-home assignments was similar to the traditional educational format. After progressing to the second
stage (copy), the students did not face any learning problems in practicing programming examples.
However, in the final stage (modify), the students were well satisfied with their performance if the
instructor did not prompt them to work further. Continual encouragement from the instructor was
required for superior project development (Chou, 2018c).
8. Conclusion
8.1. Contributions of this study
This study investigated how elementary school students used a visual programming language
(ScratchJr) to develop their CT competence through various programming projects in an experiment-based computer class. The findings confirmed that an 8-week CT project training course significantly
fostered young children’s CT competence. The students exhibited high retention of CT competence one
month after completion of the programming curriculum. Furthermore, although different learning
patterns appeared in weekly CT concepts and practice, the students’ overall CT concepts (except loops)
and practice had significantly improved after programming project training. This study identified that
the parent-involvement time in students’ assignment activities may play a key role in students’
retention of CT competence. In addition to the quantitative evidence, the qualitative findings proved
that young children can enjoy using tablet computers to learn ScratchJr programming. The students
exhibited various learning behaviors in the three-stage instructional design model. For example, several students did not pay attention to the assignment discussion in the review stage; most students had no problems copying the instructor’s programming examples in the copy stage; and most students needed the instructor’s encouragement to add additional programming blocks in the modify stage.
8.2. Research limitations and suggestions for future studies
The results of the study may be difficult to generalize to other learning settings because the research
design process had several limitations. First, this study recruited only a small number of students to
minimize the teaching tasks of the instructor. Future studies may increase the number of students (e.g.,
25) to further investigate the topic of the present study. Second, this study focused on students’
programming learning progress during CT project development. Future studies may employ a content
analysis approach to examine students’ programming works. Third, the proposed instructional design
model forced the students to repeatedly modify programming examples. Further research could strive
to determine whether encouraging students to develop new programming works benefits their learning.
Fourth, the study adopted only a one-group quasi-experimental pretest and posttest design. Future studies
may set up a control group to provide a better research comparison. Fifth, the qualitative observation
revealed that the take-home worksheets might demotivate student learning. Future studies may confirm
the effect of the assignment on students’ CT competence development. Sixth, the study employed one
specific learning achievement test (multiple-choice items) to measure students’ CT competences.
Future studies may use or develop non-selected-response test items to assess young children’s CT
competences. Different CT learning patterns may be obtained. Seventh, the parental involvement did
have an effect on students’ CT competences in the study. To identify students’ intrinsic motivation,
future studies may require students to complete worksheets before they go home. Finally, the
collaborating elementary school in this study provided only third graders as participants; students’ ages
may influence their cognitive development during programming training. Future studies could recruit
children younger than those in this study for comparison of programming patterns based on age.
8.3. Instructional implications
Because this study investigated the instructional effectiveness of ScratchJr integrated into a computer
class in an elementary school, the findings may provide some practical suggestions for educators
attempting to promote coding literacy among young children. First, learning interaction among peers
may influence students’ debugging and testing abilities. Instructors could employ a range of
collaborative learning strategies to facilitate students’ programming engagement. Second, classroom
management problems such as students using other apps during computer classes may affect students’
learning focus. Instructors could design reward rubrics to prevent such problems. Finally, depending on
the educational level of the students involved, the instructional design model proposed in this study
could be modified to fit an expected course structure. For young children, instructors could remove the
review process from the model to decrease students’ cognitive learning load.
References
Akinola, O. S., & Nosiru, K. A. (2014). Factors influencing students’ performance in computer
programming: A fuzzy set operations approach. International Journal of Advances in Engineering
& Technology, 7(4), 1141-1149.
Barr, D., Harrison, J., & Conery, L. (2011). Computational thinking: A digital age skill for everyone.
Learning & Leading with Technology, 38(6), 19-23.
Brennan, K., & Resnick, M. (2012). New frameworks for studying and assessing the development of
computational thinking. In Annual American Educational Research Association Meeting,
Vancouver, BC, Canada.
Chou, P.-N. (2018a). Little engineers: Young children's learning patterns in an educational robotics
project. In Annual World Engineering Education Forum Conference. Albuquerque, New Mexico,
USA.
Chou, P.-N. (2018b). Smart technology for sustainable curriculum: Using drone to support young students’ learning. Sustainability, 10(10), 3819, 1-17.
Chou, P.-N. (2018c). Skill development and knowledge acquisition cultivated by maker education: Evidence from Arduino-based educational robotics. EURASIA Journal of Mathematics, Science and Technology Education, 14(10), 1-15.
Chou, P.-N., & Feng, S.-T. (2019). Using tablet computer application to advance high school students’ laboratory learning experiences: A focus on electrical engineering education. Sustainability, 11(2), 381, 1-14.
Creswell, J. (2007). Qualitative inquiry and research design: Choosing among five approaches (2nd
ed.). Thousand Oaks, CA: Sage.
Donohue, C. (Ed.) (2015). Technology and digital media in the early years: Tools for teaching and learning. New York, NY: Routledge.
Falloon, G. (2016). An analysis of young students’ thinking when completing basic coding tasks using
Scratch Jnr. on the iPad. Journal of Computer Assisted Learning, 32, 576-593.
Hsu, T.-C., Chang, S.-C., & Hung, Y.-T. (2018). How to learn and how to teach computational thinking: Suggestions based on a review of the literature. Computers & Education, 126, 296-310.
International Society for Technology in Education (2018). Computational thinking competences.
Accessed March 3, 2018, from: https://www.iste.org/standards/computational-thinking
Jonassen, D. (1999). Designing constructivist learning environments. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory. Hillsdale, NJ: Lawrence Erlbaum Associates.
Korkmaz, O., Cakir, R., & Ozden, Y. (2017). A validity and reliability study of the computational
thinking scales. Computers in Human Behavior, 72, 558-569.
Lockwood, J., & Mooney, A. (2017). Computational thinking in Education: Where does it fit? A
systematic literary review. International Journal of Computer Science Education in Schools, 2(1),
41-60.
Longi, K. (2016). Exploring factors that affect performance on introductory programming courses.
Unpublished master’s thesis, Department of Computer Science, University of Helsinki.
Lye, S. Y., & Koh, J. H. L. (2014). Review on teaching and learning of computational thinking through
programming: What is next for K–12? Computers in Human Behavior, 41, 51-61.
Lee, I., Martin, F., Denner, J., et al. (2011). Computational thinking for youth in practice. ACM Inroads,
2(1), 32-37.
Mayer, R. E., Dyck, J., & Vilberg, W. (1986). Learning to program and learning to think: What’s
the connection? Communications of the ACM, 29, 605-610.
Neuendorf, K. A. (2002). The content analysis guidebook. Thousand Oaks, CA: Sage.
Pea, R., & Kurland, D. M. (1984). On the cognitive effects of learning computer programming. New Ideas in Psychology, 2, 137-168.
Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: Sage.
The Horizon Report (2017). K–12 edition. Accessed January 1, 2018 from:
https://www.nmc.org/nmc-horizon-k12/
Resnick, M., Maloney, J., Monroy-Hernandez, A., et al. (2009). Scratch: Programming for all.
Communications of the ACM, 52(11), 60-67.
Saez-Lopez, J., Roman-Gonzalez, M., & Vazquez-Cano, E. (2016). Visual programming languages
integrated across the curriculum in elementary school: A two year case study using Scratch in
five schools. Computers & Education, 97, 129-141.
Schmidt, A. (2016). Increasing computer literacy with the BBC micro:bit. IEEE Pervasive Computing,
15(2), 5-7.
Strawhacker, A., Lee, M., & Bers, M. U. (2018). Teaching tools, teacher’s rules: Exploring the impact
of teaching styles on young children’s programming knowledge in Scratch Jr. International
Journal of Technology and Design Education, 28(2), 347-376.
Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33-35.
Wing, J. M. (2008). Computational thinking and thinking about computing. Philosophical Transactions of the Royal Society A, 366, 3717-3725.
Yin, R. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.
Appendix (weekly CT projects)
| Week | CT Project Name | Description | Difficulty |
|---|---|---|---|
| 1 | Skating | To allow the cat to skate to the goal and come back to the starting point | ★ |
| 2 | Eating bugs | To allow the hungry frog to eat many bugs | ★ |
| 3 | Claw machine | To catch all the animals into the box | ★ |
| 4 | Soccer | To hit the ball to the net and answer the math questions | ★★ |
| 5 | Racing | To allow all animals to play in a race | ★★ |
| 6 | Constellation | To allow the crab to walk in the constellation pattern | ★★ |
| 7 | Coin collecting | To control the octopus to catch the falling coins | ★★★ |
| 8 | Maze | To control the elf to eat all the dots in the maze | ★★★ |