Teacher Self-Efficacy and Usage: The Case of the XO Laptop in Alabama
Shaundra Bryant Daily
School of Computing, Human-Centered Computing Division
Clemson University
United States
sdaily@clemson.edu
Shelia Cotten
Department of Sociology
The University of Alabama at Birmingham
United States
cotten@uab.edu
Philip Gibson
Department of Sociology
University of Alabama at Birmingham
United States
pagibson@uab.edu
Michael Howell-Moroney
School of Urban Affairs and Public Policy, Division of Public and Nonprofit Administration
The University of Memphis
United States
mhwllmrn@memphis.edu
LaToya O’Neal
Department of Sociology
University of Alabama at Birmingham
United States
ljoneal@uab.edu
Abstract. As with any introduction of new technology, especially in schools, teacher acceptance is
key. While much research has shown that increasing teacher self-efficacy and skill can support
adoption, there are other cultural, social, and political barriers that can inhibit adoption. In this
paper, we describe the results of a summer institute and in-service professional development
sessions that reintroduced the XO laptop to teachers, supported them in understanding its features,
provided them with lesson plans to support their math and science curricular goals, and created
opportunities for them to design their own lesson plans. Our findings suggest that intensive
interventions such as summer institutes as well as professional development have a positive effect
on self-efficacy toward a novel platform, but not necessarily on the usage and adoption of new
technologies.
The amount and types of technology utilized in the classroom have vastly expanded in recent decades.
Educators have introduced electronic portfolios, Smart Boards, virtual worlds, the Nintendo Wii, social media
technologies (e.g., Facebook, Twitter), and netbooks with varying levels of success (Cuban, 2003). Our primary
goal in this research has been to increase teacher self-efficacy toward a novel platform: the XO laptop. Although the
XO laptop looks like many netbooks, its graphical user interface, called Sugar, is different from the Windows or Mac
interfaces that teachers may have prior experience with. The objective of this paper is to examine whether
participation in a structured intervention involving summer teacher institutes and professional development sessions
is associated with increased teacher self-efficacy and usage of the XO laptop.
Self-efficacy, as defined by Bandura, is a person's perception of their abilities within a given domain; when
high, it provides positive support for action. These beliefs may influence many aspects of behavior, including the
emotional response to the success of an endeavor, the course of action chosen, and the amount and duration of effort
put forth (Bandura, 1997, p. 3). There are numerous influences on computer self-efficacy beliefs, including mastery
experiences (success in task performance), vicarious experiences (observing successful task performance), social
influences (others expressing positive persuasion regarding successful task performance), and physiological as well
as affective states (one’s perception or belief in the implications of physiological responses during task performance)
(Feltz & Lirgg, 2001). In the computing domain, computer self-efficacy (CSE) refers to the belief in one’s ability to
use computer technology. Those with a lower sense of self-efficacy are easily frustrated by stumbling blocks to
their performance, which leads to decreased perceptions of their capability to use computers. As a result of these
diminished perceptions, they will be less willing to use computers in the future. Conversely, individuals with a high
sense of self-efficacy are not easily discouraged by setbacks. They will persist in the face of challenges and will be
more likely to overcome obstacles. Success strengthens their resolve to continue to use computers as a classroom
tool (Cuban, 2003). For teachers, there is evidence to suggest that computer self-efficacy is more important than
skills and knowledge about computers when it comes to actually using computers in classroom instruction (Ertmer
& Ottenbreit-Leftwich, 2010; Suur-Inkeroinen & Seppanen, 2011).
The XO Laptop
In 2005, a consortium headquartered in Cambridge, Massachusetts formed “One Laptop Per Child” (OLPC).
The initiative emerged from the vision of several people associated with the Massachusetts Institute of
Technology Media Laboratory, including Nicholas Negroponte, Seymour Papert, and David Cavallo, and the
non-profit's mission envisaged locations outside of the United States. According to their
website, OLPC’s mission is:
“…to create educational opportunities for the world's poorest children by providing each child with a
rugged, low-cost, low-power, connected laptop with content and software designed for collaborative,
joyful, self-empowered learning. When children have access to this type of tool they get engaged in their
own education. They learn, share, create, and collaborate. They become connected to each other, to the
world and to a brighter future.” (OLPC, 2011)
The XO laptop was developed based on theories of constructionist learning; the idea being that students
learn best by doing, creating, and experimenting in collaboration (Papert, 1993). The XO laptop also emphasized
collaborative learning through the use of a unique user interface and “mesh networking” that allows students to see
others nearby and invite them to work together on projects. All of the XO software, including the operating system,
is open-source, and highly customizable, thus providing students and teachers the ability to customize the XO in
ways that better fit their own learning and teaching styles. XO laptops include various Activities (programs or
applications) geared towards supporting the envisioned goals.
Method
Participants
Birmingham City School administrators were asked to select six schools to participate in Year 2 of the
intervention (2010-2011). Principals of the selected schools were contacted in the spring of 2010 to determine
interest, and all principals agreed to their schools' participation. They informed teachers of the opportunity to
participate in a weeklong (35 hours) training institute on integrating computing across the curriculum. Institutes
were open to teachers from Year 1 and Year 2 schools. Of those attending the summer institutes, 65% were from
the Year 2 schools. Our sample consisted of 33 fourth and fifth grade teachers from the six schools participating in
the intervention during the 2010-2011 school year. Our sample is 82% African American and 82% female. The
mean age of participants was 41.62 years and the average years of teaching experience was 13.
Study Context
In 2008, 15,000 laptops were purchased by the mayor of Birmingham, Alabama and given to the
elementary school students in the city’s school district. Since the school district had not previously approved the
purchase, a compromise was reached: a pilot at one of the district's elementary schools. To support this pilot, four
consultants were sent to Birmingham in April to work with the pilot school teachers, in particular five fifth
grade teachers. In addition, a media specialist was selected to assist. As part of the agreement, the teachers were
to receive a week’s worth of training on the basic functions of the laptop. Two consultants then returned for the
summer to support further training in the district. The pilot received approval from the school board and laptops
were distributed to the rest of the district.
Teachers whose students received laptops participated in professional development that consisted of, on
average, only two hours’ worth of introduction to the novel Sugar interface on the laptop. This dearth of preparation
resulted in negative attitudes towards the XOs, and diminished utilization of the platform in classrooms across the
district (Cotten, Hale, Moroney, O’Neal, & Borch, 2011; Warschauer, Cotten, & Ames, 2012). During the year
following the distribution of the laptops, the authors received grant funding to provide teachers in the district with
ongoing professional development focused on learning how to integrate this unfamiliar platform into their
everyday classroom practices.
Intervention
Our approach to supporting teachers in integrating this novel platform blended summer institutes with
in-service professional development. We grounded our method in research showing that change in teacher
confidence toward technology can take an extended amount of time (Brinkerhoff, 2006) and that implementation in
small doses can be key to impacting teacher self-efficacy (Ringstaff & Yocam, 1995). At summer institutes, we
worked to expand teacher vision for the utility of the laptop beyond a tool for supporting lecture-based instruction
(Pellegrino, Goldman, Bertenthal, & Lawless, 2007). We also demonstrated that novel uses of programs could
enhance their lessons (Ertmer, Conklin, & Lewandowski, 2001).
At the beginning of the school year (Fall 2010), meetings were held with principals and teachers to discuss
the integration model for in-service training. We offered a total of six training topics for the 2010-2011 school year.
During the in-service professional development, we attempted to align the experiences with existing pedagogical
beliefs and knowledge (Ertmer & Ottenbreit-Leftwich, 2010) by working with teachers to develop their own math
and science lesson plans. The goals of the first session were to provide a refresher on the XO laptop and to
introduce teachers to ways of using the XO activities Record, Write, and Paint. Teachers were provided with
lesson plans, aligned to the course of study, that used those activities. In session 2, we showed teachers how to use
the XO activity Memorize to review fractions. The third session incorporated both teachers and students and
covered minor repairs to the XO. The last three sessions focused on the visual programming language Scratch;
we introduced teachers to the basics as well as how to create movies, animations, and games. Each school received
at least three of the six training sessions. Because training sessions were not mandatory, not every teacher
received the same amount of training.
Measures
Data were collected via surveys at two different time periods: the beginning of the school year (August, the
pre-test) and the end of the school year (May, the post-test). Only post-test surveys were used in this particular
analysis, given the research questions being examined. The key dependent variables included teacher judgment of
overall skill level using the XO laptop for instruction (0=Novice to 4=Expert) and how much teachers used each of
the ten core applications on the XO laptops (0=None to 4=Almost every day or every
day). The main independent variables included teacher institute participation (1=yes) and number of professional
development sessions attended (count variable).
Control variables were also used in the analysis. Teacher experience was included as a continuous variable,
indicating the number of years a participant had been a teacher at the time of the study. Gender was measured as a
dichotomous variable, with 0 indicating male and 1 indicating female. Race was also measured as a dichotomous
variable due to the racial homogeneity of the sample, with 0 indicating that the participant was not African
American and 1 indicating that the participant was African American.
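As an illustration, the control measures could be coded as follows in Stata (the software later used for estimation); the variable names here are hypothetical rather than taken from the study's dataset:

* Hypothetical coding of the control variables (names illustrative)
gen female = (gender_str == "Female")                    // 1 = female, 0 = male
gen african_american = (race_str == "African American")  // 1 = African American, 0 = otherwise
* years_teaching enters the models directly as a continuous count of years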
Analytic Design
We begin our analysis by examining the effects of the summer institutes and professional development on
XO efficacy. Our primary question of interest is the degree to which the summer institutes and professional
development affected efficacy. To study the effects, we estimated a series of ordered logit models. We measured
summer institute participation using a simple binary indicator (1 = attended the summer institute, 0 = otherwise).
For professional development, we developed an ordinal count measure that counts the number of professional
development sessions a teacher attended. This measure varies from 0 to 5. We used clustered standard errors to
account for intraclass correlation at the school level.
Five models were estimated predicting XO efficacy. The first and second models include the isolated
effects of the summer institute and professional development variables, respectively. Model 3 looks at the partial effects
of both the summer institutes and the professional development variables. Model 4 includes an interaction effect of
the summer institute and professional development variables in order to test the joint effects of both interventions on XO
efficacy. Model 5 includes all variables in Model 4 along with the control variables for gender, race, and teaching
experience.
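The paper's estimates were obtained using Stata; a minimal sketch of the five efficacy models, assuming hypothetical variable names (xo_skill, institute, pd_sessions, school), might look like this:

* Ordered logit models of XO efficacy; vce(cluster school) gives
* standard errors robust to intraclass correlation within schools
ologit xo_skill institute, vce(cluster school)                   // Model 1
ologit xo_skill pd_sessions, vce(cluster school)                 // Model 2
ologit xo_skill institute pd_sessions, vce(cluster school)       // Model 3
ologit xo_skill i.institute##c.pd_sessions, vce(cluster school)  // Model 4
ologit xo_skill i.institute##c.pd_sessions c.years_teaching i.female i.african_american, vce(cluster school)  // Model 5

The ## factor notation expands to both main effects and their interaction, matching the specification of Models 4 and 5.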
The other focus of this study is the use of XO applications in a classroom setting. In order to analyze how
our interventions affected usage, we created a summative scale based on usage of ten applications on the XO
platform (alpha=.915). We then estimated five models predicting this XO usage scale using ordinary least squares
regression. The models for this analysis use the same independent variables as the models for XO efficacy. We also
used cluster robust standard errors in this model to account for within-school correlation between teachers.
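Under the same hypothetical naming, the usage scale and the corresponding OLS models might be constructed as follows; use_app1 through use_app10 stand in for the ten application-use items:

* Summative usage scale across the ten application-use items; alpha
* reports Cronbach's alpha (.915 in the paper) and gen() saves the
* resulting scale (by default, the average of the items)
alpha use_app1-use_app10, gen(xo_usage)
* OLS regression of usage with cluster robust standard errors (full model shown)
regress xo_usage i.institute##c.pd_sessions c.years_teaching i.female i.african_american, vce(cluster school)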
Results
Descriptive Statistics
As noted in the Participants section, our sample consists of 33 fourth and fifth grade teachers from the six
schools participating in the intervention during the 2010-2011 school year; 82% are African American, 82% are
female, the mean age is 41.62 years, and the average teaching experience is 13 years.
Effects of Institutes and Professional Development on Efficacy
In the first analysis, we examined the effects on efficacy over the full school year, as measured at the end
of the year. The dependent variable is self-rated efficacy using the XO as a tool to
teach in the classroom. Survey respondents were asked to rate their “Overall skill level using the XO laptop for
instruction.” Tab. 1 contains the basic tabulation of the responses. At the end of the year, 7 teachers ranked
themselves as beginners, 19 as intermediate, 6 as advanced and 1 as expert.
Table 1: Teacher Ratings of XO Skill
Our primary question of interest is the degree to which the summer institutes and professional development
affected efficacy. As described in the Analytic Design section, we estimated a series of ordered logit models with
standard errors clustered at the school level. Tab. 2 contains the results obtained using Stata. The first model estimates
baseline effect of summer institute participation on efficacy. The effect is both positive and statistically significant.
Model 2 shows that the bivariate effect of professional development on efficacy is also positive and statistically
significant. Model 3 contains both variables together, showing that each has a significant partial effect.
Table 2: Ordered Logit Results for Teacher Efficacy
For Model 4, we were interested in examining the interactive effects of the summer institutes and
professional development. That is, what are the combined effects of both interventions on efficacy? Model 4
includes an interaction term which multiplies professional development by the institute variable. Finally, Model 5
includes the interaction as well, along with controls for years of teaching experience, gender and race.
The logic behind including the interaction term is that there may be a combined effect of both components.
The results suggest that there is a significant interaction effect. Interpreting the results is more straightforward if we
go back to how the term is calculated. The teacher institute variable is binary, so it equals 1 if the teacher attended
and 0 otherwise. The professional development variable is an ordinal measure that counts the number of professional
development sessions attended. So now we have two possibilities: a teacher who attended the institute and a teacher
who did not attend. To fix ideas, we define the coefficient on teacher institute as α, the coefficient on professional
development as β and the coefficient on the interaction term as λ. In the first case, the teacher did not attend the
institute, so the binary variable equals 0 and α drops out. The interaction term λ also drops out. Thus,
Efficacy|Nonparticipant = β * Number of PD Sessions
On the other hand, we have those that attended the institute. The effect of professional development for
them is different. Because the binary variable for institute participation is 1 for participants, the prediction becomes,
Efficacy|Participant = α + β * Number of PD Sessions + λ * Number of PD Sessions
which simplifies to
Efficacy|Participant = α + (β + λ) * Number of PD Sessions
The empirical results for Models 4 and 5 show some interesting interaction effects. As with the previous
models, both the summer institute and professional development have positive and statistically significant effects.
The interaction term, however, is negative. From the analysis above, the coefficient on the interaction term λ is
defined as the marginal benefit that institute participants receive from professional development over and above that
of nonparticipants. For example, using the results from Model 4:
Efficacy|Nonparticipant = .79 * Number of PD Sessions
Efficacy|Participant = 5.688 + (.79 - 1.005) * Number of PD Sessions
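To make the contrast concrete, consider a hypothetical teacher who attended three professional development sessions (an illustrative value, not a case drawn from the data):

Efficacy|Nonparticipant = .79 * 3 = 2.37
Efficacy|Participant = 5.688 + (.79 - 1.005) * 3 = 5.688 - .645 ≈ 5.04

The institute participant's predicted index remains higher overall, but each additional session narrows the gap between the two groups.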
Seen in this light, these results make intuitive sense; those that did not attend the institute are deriving more
benefit from professional development than those that did attend the institute. But those that did attend the institute,
though they do not receive as much benefit from professional development, receive a substantial benefit from the
institute that non-participants did not receive. Finally, Model 5 includes the controls for years of teaching
experience, gender and race. Here we see that African American participants had a higher level of efficacy.
Effects of Institutes and Professional Development on XO Usage
A second question relates to the effects of the intervention activities on actual XO laptop usage in the
classroom. In our survey, we asked teachers how much they used each of the core applications on the XO laptops.
Tab. 3 shows the frequency of usage for each of the applications.
Table 3: Frequency of Usage for Core XO Applications
The numbers in the table indicate that XO usage was fairly infrequent. Very few teachers used XO
applications every day or even two to three times per week. The vast majority of teachers used the individual
applications less than once per week or not at all. In order to analyze how our interventions affected usage, we
created a summative scale based on application usage (alpha=.915). We used this scale as the dependent variable in
a series of regressions which mimic the specifications of the independent variables used earlier.
Table 4 reports the results; as in the efficacy analysis, we used cluster robust standard errors to account for
within-school correlation between teachers. The effects of the interventions on XO usage show some interesting
contrasts to our models dealing with efficacy. The effect of the summer institutes is not as consistent: in the
baseline model, the effect is not statistically significant, nor is it significant in the third model, though it is
significant in the last two models. Professional development is statistically significant across all of our models,
but in Model 5, the effect is greatly diminished once we add in all of the control variables. The interaction term is
only significant in the last model, providing some evidence of a lesser effect of professional development on
institute participants. The only control variable that was significant was the indicator for African Americans.
Table 4: OLS Regression Results for XO Application Usage by Teachers (cluster robust standard errors in parentheses; * p < 0.10, ** p < 0.05, *** p < 0.01)
These results paint a slightly different picture than those for efficacy. Here, the empirical evidence suggests
a less robust effect of the summer institutes on XO usage and a fairly consistent positive effect resulting from
exposure to professional development, particularly for those that did not attend the summer institutes. In contrast,
the efficacy models showed a consistent effect of the summer institutes upon efficacy and a lesser, but significant,
effect from professional development. Again, professional development seemed to primarily benefit those that did
not attend the institutes.
Conclusions
While professional development and ongoing mentoring have been shown to enhance teacher technology
self-efficacy across a number of studies (Brinkerhoff, 2006; Swan & Dixon, 2006), few studies have contributed to
an understanding of how to enhance confidence in a platform that resembles traditional technologies but has an
unfamiliar interface. Further, little formal research has been done specifically on the incorporation of XO
laptops in the classroom. Most of the XO laptop research to date in the US has focused on student usage of the XOs
(Cotten et al., 2011) or evaluating the One Laptop Per Child program compared to other 1-to-1 computing programs
in the US (Warschauer et al., 2012).
Our results suggest that intensive summer institutes and follow up professional development can enhance
teacher self-efficacy; however, the interaction effects suggest that ongoing interventions do not improve efficacy for
teachers who participate in the initial interventions relative to their peers who do not participate. Instead, it seems
that their levels of efficacy regress towards the levels of their peers. We suggest that the needs of teachers who
participated in the summer institutes are different than those who did not. Sessions may have improved efficacy
among non-institute participants because they addressed beginners’ needs. However, such sessions would not be as
helpful for someone with a working knowledge of the technology. In other words, teachers who participated in
summer institutes might thus benefit from building on their existing knowledge through more advanced professional
development sessions. Even with enhanced efficacy, our results suggest that XO laptop usage levels were not
significantly impacted. The context within which this intervention occurred may account for this pattern. The school
system was in flux at the time as administrative leadership was changing, the system is a high poverty school
district, and there is continual pressure to enhance test scores. These structural factors affect the time and attention
that teachers can devote to implementing new technologies in the classroom, even when there is on-going
professional development designed to facilitate this integration. In order to translate efficacy into implementation,
technology interventions such as the XO laptop project should come alongside teachers' existing goals and provide
them with the necessary time and resources to develop their knowledge. Initial interventions can be expected to set
the foundation for both efficacy and implementation, and professional development, tailored to individual needs,
can be used to maintain efficacy and increase implementation.
Acknowledgements
This material is based on work supported by the National Science Foundation Discovery Research K-12 under Grant No.
0918216.
References
Bandura, A. (1997). Self-Efficacy: The Exercise of Control. New York: W. H. Freeman.
Brinkerhoff, J. (2006). Effects of a Long-Duration, Professional Development Academy on Technology Skills, Computer Self-Efficacy, and Technology Integration Beliefs and Practices. Journal of Research on Technology in Education, 39(1),
22–43.
Cotten, S. R., Hale, T. M., Moroney, M. H., O’Neal, L., & Borch, C. (2011). Using Affordable Technology to
Decrease Digital Inequality. Information, Communication & Society, 14(4), 424–444.
doi:10.1080/1369118X.2011.559266
Cuban, L. (2003). Oversold and Underused: Computers in the Classroom. Harvard University Press.
Ertmer, P. A., Conklin, D., & Lewandowski, J. (2001). Increasing Preservice Teachers’ Capacity for Technology Integration
through Use of Electronic Models. Retrieved from
http://www.eric.ed.gov/ERICWebPortal/contentdelivery/servlet/ERICServlet?accno=ED470081
Ertmer, P. A., & Ottenbreit-Leftwich, A. T. (2010). Teacher Technology Change: How Knowledge, Confidence, Beliefs, and
Culture Intersect. Journal of Research on Technology in Education, 42(3), 255–284.
Maloney, J., Burd, L., Kafai, Y., Rusk, N., Silverman, B., & Resnick, M. (2004). Scratch: A Sneak Preview. Proceedings of the
Second International Conference on Creating, Connecting and Collaborating through Computing (pp. 104–109). IEEE
Computer Society. Retrieved from http://portal.acm.org/citation.cfm?id=1009376.1009408
Maloney, J. H., Peppler, K., Kafai, Y., Resnick, M., & Rusk, N. (2008). Programming by choice: urban youth learning
programming with scratch. Proceedings of the 39th SIGCSE technical symposium on Computer science education,
SIGCSE ’08 (pp. 367–371). New York, NY, USA: ACM. doi:10.1145/1352135.1352260
Papert, S. A. (1993). Mindstorms: Children, Computers, And Powerful Ideas (2nd ed.). Basic Books.
Pellegrino, J. W., Goldman, S. R., Bertenthal, M., & Lawless, K. (2007). Teacher Education and Technology: Initial Results from
the “What Works and Why” Project. Yearbook of the National Society for the Study of Education, 106(2), 52–86.
doi:10.1111/j.1744-7984.2007.00115.x
Suur-Inkeroinen, H., & Seppanen, M. (2011). Effects of emotions and self-efficacy on technology usage behavior. Technology
Management in the Energy Smart World (PICMET), 2011 Proceedings of PICMET ’11: (pp. 1 –6).
Swan, B., & Dixon, J. (2006). The Effects of Mentor-Supported Technology Professional Development on Middle School
Mathematics Teachers’ Attitudes and Practice. Contemporary Issues in Technology and Teacher Education, 6(1), 67–
86.
Warschauer, M., Cotten, S. R., & Ames, M. G. (2012). One Laptop per Child Birmingham: Case Study of a Radical Experiment.
International Journal of Learning and Media, 3(2), 61–76. doi:10.1162/ijlm_a_00069