Conducting Classroom Observations to Understand TPACK: Moving Beyond
Self-reported Data
Shu-Ju Diana Tai and Denise Crawford
Center for Technology in Learning & Teaching, Iowa State University
United States
shujutai@gmail.com
dschmidt@iastate.edu
Abstract: This study reports the TPACK knowledge and competency observed in K-6 classrooms
of exemplary teachers with the purpose of designing an observation instrument that documents
specific teacher characteristics that align with the seven domains of TPACK. Participants were four
elementary teachers recommended by administrators or technology directors of schools
located in the Midwestern United States. Data were collected through two sources, observations and
interviews. Based on the analysis of the data sets, 39 main codes within seven domains of the
TPACK framework were identified (60 counting all subcodes). In addition to a full code name,
each code contains definitions with examples from observation and interview data. Findings from
the study provide a more in-depth way of understanding what TPACK “looks like” in a classroom
and what specific characteristics contribute to a teacher’s ability to apply and foster the interplay
between content, pedagogy and technology in classrooms.
While first-generation TPACK research focused mainly on defining and conceptualizing the constructs of
the TPACK framework, more recent studies have shifted their focus to using the framework to assess and measure
the levels and development of teachers’ TPACK. With more clearly defined and interpreted TPACK constructs,
research has evolved that uses the TPACK framework to understand teachers’ knowledge of integrating technology
to facilitate and enhance their teaching, and to examine whether such knowledge develops through technology
interventions. With the purpose of understanding the impact of technology interventions in mind, researchers began
to develop assessment tools to measure teachers’ TPACK, mainly using survey instruments (Archambault & Crippen,
2009; Koehler & Mishra, 2005; Schmidt et al., 2009a, 2009b). These instruments encouraged a line of research that examined
teachers’ development of TPACK using different approaches focused on technology intervention and tools,
including an activity type structure approach (Harris & Hofer, 2009; Graham, Cox, & Velasquez, 2009), ICT
integration (Angeli & Valanides, 2009; Chai et al., 2010), learning by design (Lu et al., 2011), performance
assessments (e.g., Angeli & Valanides, 2009; Harris, Grandgenett, & Hofer, 2010), TPACK Web model (Lee & Tsai,
2010), and TPACK-in Practice (Figg & Jaipal, 2012).
In addition to survey instruments, some researchers have designed specific assessment instruments that
measure teachers’ TPACK, including a performance assessment rubric analyzing teachers’ lesson plans (Harris,
Grandgenett, & Hofer, 2010; Harris, Grandgenett, & Hofer, 2011), an observation rubric to score a video of teaching
(Hofer, Grandgenett, Harris & Swan, 2011), multiple types of TPACK-based content analyses (Hechter & Phyfe
2010; Koh & Divaharan, 2011), and verbal analyses (Mouza, 2011; Mouza & Wong, 2009). It appears that TPACK
can be assessed in different ways and at different times, taking into account diverse students and various educational
contexts (Koehler & Mishra, 2008). Because the TPACK framework is complex, with several constructs working in
relation to each other, it has been suggested that a single data source is not sufficient to measure teachers’
TPACK. In other words, data should be collected from different sources because “external assessment of those
practices and their artifacts, triangulated with the contents of teachers’ self-reports, should help us to better
understand the nature of their TPACK by inference” (Harris et al., 2010, p. 324).
Thus, the present study was designed to investigate teachers’ TPACK based on data collected from
observations and interviews in order to shed important light on what TPACK “looks like” in classrooms and what
specific characteristics contribute to a teacher’s ability to apply and foster the interplay between content, pedagogy
and technology in classrooms.
Research Methodology
The following sections describe the research methodology used in this study, whose purpose was to
develop an observation instrument that measures teachers’ observable classroom behaviors aligned with the
seven TPACK domains. These sections provide more information about the participants, how data were collected,
the research procedures, and the data analysis.
Participants
Participants of the study were four elementary (K-6) classroom teachers recommended by school
administrators and/or technology directors of schools located in the Midwestern United States. Recommendations were
based upon the administrators’ and/or technology directors’ evaluations of each teacher’s use of technology in the
classroom and the role that technology played in teaching and learning. Detailed information about each
teacher (e.g., demographics, teaching experience) will be provided in the final paper.
Data Collection & Research Procedures
This study used a qualitative case study approach where data were collected using two data sources,
classroom observations and individual teacher interviews. In order to collect systematic measures of observation and
interview data, two instruments were created before data collection started, including the observation protocol and
the interview guide (Borg & Gall, 1996; Borrego & Hirai, 2004).
After nominated teachers were identified, an email invitation was sent to each. Four
teachers agreed to participate in the study. Researchers then worked with the teachers to schedule two separate
classroom observations for each teacher participant. In addition, a follow-up interview was conducted with each
teacher after each classroom observation to clarify some questions that emerged as a result of observing the teacher
in the classroom. The observations were documented using a TPACK observation protocol created specifically for
this study. Each interview conducted used a semi-structured approach that included a predetermined list of questions
and additional questions that resulted from observing the teacher in the classroom.
Data Analysis
Each interview was recorded and transcribed for data analysis. Before the actual data analysis, the
observation and interview data were sorted into units of meaning (UoM) for analysis based on Mohan’s (2007)
concept of a social practice. Each set of data was coded by two coders and the intercoder reliability was calculated
based on Mackey and Gass’s (2005) simple percentage agreement.
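The simple percentage agreement measure referenced above can be sketched as follows; the helper function and the code assignments shown are illustrative assumptions, not the study’s actual data or scripts:

```python
def percentage_agreement(codes_a, codes_b):
    """Simple percentage agreement: the proportion of units of meaning
    to which both coders assigned the same code."""
    if len(codes_a) != len(codes_b):
        raise ValueError("Both coders must rate the same units of meaning")
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

# Hypothetical codes assigned by two coders to five units of meaning
coder_1 = ["TK", "PK", "TPK", "CK", "TPK"]
coder_2 = ["TK", "PK", "TPK", "PK", "TPK"]
print(percentage_agreement(coder_1, coder_2))  # 0.8
```

Percentage agreement is a straightforward reliability check, though it does not correct for chance agreement the way measures such as Cohen’s kappa do.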
Grounded theory methodology (Strauss & Corbin, 1998) was used to guide the overall design of the
qualitative data analysis. Grounded theory methodology is an analytic approach that in this study employed a coding
system and comparative analysis to identify themes that emerged from the data collected. Thus, the data analysis
started with an open coding process to allow themes and codes to emerge. After the first observation and interview
were transcribed, researchers met for a preliminary analysis of the data. A definition for each TPACK domain was
given and the researchers used those definitions to guide their data analysis for identifying the codes that existed in
the seven TPACK domains. As a result of much discussion between researchers, themes and codes emerged and
were generated based on the seven domains within the TPACK framework.
After the first round of data analysis, a list of codes for the TPACK domains was created and recorded
in a codebook. Each codebook entry contains a code name, a full code label, a definition, examples from the data set, and
key words. The analysis was cyclical, as this process was repeated after every teacher observation and interview. As
new themes and codes emerged they were added to the codebook. Currently, the codebook contains 39 main codes
(60 counting all sub-codes) that represent the seven domains within the TPACK framework. The data analysis is
continuing and will be completed and included in the final paper.
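The codebook entry structure described above can be sketched as a simple data structure; the field names and the sample entry are illustrative assumptions, not the authors’ actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class CodebookEntry:
    """One codebook entry; field names are illustrative, not the study's schema."""
    name: str         # short code name
    full_code: str    # full code label
    domain: str       # one of the seven TPACK domains (TK, PK, CK, TPK, TCK, PCK, TPACK)
    definition: str
    examples: list = field(default_factory=list)   # excerpts from observation/interview data
    key_words: list = field(default_factory=list)

# Hypothetical entry for illustration only
entry = CodebookEntry(
    name="TK-troubleshoot",
    full_code="Technological Knowledge: troubleshooting classroom tools",
    domain="TK",
    definition="Teacher resolves hardware or software problems during instruction.",
    examples=["Teacher restarts the interactive whiteboard mid-lesson."],
    key_words=["troubleshoot", "device", "fix"],
)
```

Storing entries this way keeps the code name, definition, examples, and key words together, which supports the cyclical process of adding new codes after each observation and interview.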
Conclusion
This research study continues as the researchers complete data collection and analysis. In the proposed
SITE paper, the results of this study will be reported and those results will include an observation instrument/tool
that can be used to document observed TPACK behaviors that teachers use in classrooms. Such an instrument
will help validate, with observed behaviors, how teachers are actually applying TPACK and how
that knowledge is then used to facilitate classroom instruction and impact student learning. In this case, the
researchers plan to use the TPACK observation tool specifically with preservice teachers who will be in practicum
and student teaching placements in K-12 classrooms. These observation data, along with quantitative data collected
previously by surveys, will provide a triangulated approach to gathering evidence of teachers’ TPACK and their
actual classroom practices that reflect that knowledge.
References
Angeli, C., & Valanides, N. (2009). Epistemological and methodological issues for the conceptualization,
development, and assessment of ICT–TPCK: Advances in technological pedagogical content knowledge
(TPCK). Computers and Education, 52(1), 154-168.
Archambault, L., & Crippen, K. (2009). Examining TPACK among K-12 online distance educators in the United
States. Contemporary Issues in Technology and Teacher Education, 9(1), 71-88.
Chai, C. S., Koh, J. H. L., & Tsai, C.-C. (2010). Facilitating preservice teachers’ development of
technological, pedagogical, and content knowledge (TPACK). Educational Technology & Society, 13(1),
63-73.
Figg, C. & Jaipal, K. (2012). TPACK-in-Practice: Developing 21st Century Teacher Knowledge. In P. Resta (Ed.),
Proceedings of Society for Information Technology & Teacher Education International Conference 2012
(pp. 4683-4689). Chesapeake, VA: AACE.
Graham, C., Cox, S. & Velasquez, A. (2009). Teaching and Measuring TPACK Development in Two Preservice
Teacher Preparation Programs. In I. Gibson et al. (Eds.), Proceedings of Society for Information Technology
& Teacher Education International Conference 2009 (pp. 4081-4086). Chesapeake, VA: AACE.
Harris, J., & Hofer, M. (2009, March). Instructional planning activity types as vehicles for curriculum-based TPACK
development. In Society for Information Technology & Teacher Education International Conference (Vol.
2009, No. 1, pp. 4087-4095).
Harris, J., Grandgenett, N., & Hofer, M. (2010). Testing a TPACK-based technology integration assessment rubric.
In C. Crawford, D. A. Willis, R. Carlsen, I. Gibson, K. McFerrin, J. Price & R. Weber (Eds.), Proceedings
of the Society for Information Technology & Teacher Education International Conference 2010 (pp. 3833–
3840). Chesapeake, VA: AACE.
Hechter, R., & Phyfe, L. (2010, March). Using online videos in the science methods classroom as context for
developing preservice teachers’ awareness of the TPACK components. In Society for Information
Technology & Teacher Education International Conference (Vol. 2010, No. 1, pp. 3841-3848).
Hofer, M., Grandgenett, N., Harris, J. & Swan, K. (2011). Testing a TPACK-Based Technology Integration
Observation Instrument. In M. Koehler & P. Mishra (Eds.), Proceedings of Society for Information
Technology & Teacher Education International Conference 2011 (pp. 4352-4359). Chesapeake, VA:
AACE.
Koehler, M. and Mishra, P. (2008). Introducing TPCK. In: AACTE Committee on Innovation Technology (Eds.).
Handbook of Technological Pedagogical Content Knowledge (TPCK) for Educators (pp. 3-29). London:
Routledge Taylor and Francis Group.
Koehler, M. J., & Mishra, P. (2005). What happens when teachers design educational technology? The development
of Technological Pedagogical Content Knowledge. Journal of Educational Computing Research, 32(2),
131–152.
Koh, J. H., & Divaharan, S. (2011). Developing pre-service teachers' technology integration expertise through the
TPACK-developing instructional model. Journal of Educational Computing Research, 44(1), 35-58.
Lee, M. H. & Tsai, C. C. (2010). Exploring teachers’ perceived self-efficacy and technological pedagogical content
knowledge with respect to educational use of the World Wide Web. Instructional Science, 38(1), 1-21.
Lu, L., Johnson, L., Tolley, L., Gilliard-Cook, T. & Lei, J. (2011). Learning by Design: TPACK in Action. In M.
Koehler & P. Mishra (Eds.), Proceedings of Society for Information Technology & Teacher Education
International Conference 2011 (pp. 4388-4395). Chesapeake, VA: AACE.
Mackey, A., & Gass, S. M. (2005). Second language research: Methodology and design. Routledge.
Mohan, B. (2007). Knowledge Structures in Social Practices. In J. Cummins & C. Davison (Eds.), International
Handbook of English Language Teaching (pp. 303-315). Boston, MA: Springer US.
Mouza, C. (2011). Promoting Urban Teachers' Understanding of Technology, Content, and Pedagogy in the Context
of Case Development. Journal of Research on Technology in Education, 44(1), 1-29.
Mouza, C., & Wong, W. (2009). Studying classroom practice: Case development for professional learning in
technology integration. Journal of Technology and Teacher Education, 17(2), 175-202.
Schmidt, D. A., Baran E., Thompson A. D., Koehler, M. J., Mishra, P. & Shin, T. (2009a). Technological
pedagogical content knowledge (TPACK): The development and validation of an assessment instrument for
preservice teachers. Journal of Research on Technology in Education, 42(2), 123-149.
Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S. (2009b). Examining
preservice teachers’ development of technological pedagogical content knowledge in an introductory
instructional technology course. In I. Gibson, R. Weber, K. McFerrin, R. Carlsen, & D. A. Willis (Eds.),
Proceedings of Society for Information Technology and Teacher Education International Conference, 2009
(pp. 4145–4151). Chesapeake, VA: AACE.