Examining the Impact of Context on Preservice Teachers’ Sense of Efficacy


Dr. S. Michael Putman

University of North Carolina at Charlotte

Michael.Putman@uncc.edu

Teacher Preparation

• Theory-Practice Disconnect

  ◦ Ball, Sleep, Boerst, & Bass, 2009; Grossman & McDonald, 2008

Field Experiences

• Zeichner, 2010

• Capraro, Capraro, & Helfeldt, 2010

Teaching Efficacy

• Bandura, 1997

• Posnanski, 2007; Clift & Brady, 2005

Teaching Efficacy and Field Experiences

• Woolfolk Hoy & Spero, 2005; Knoblauch & Hoy, 2008

• Oh et al., 2005; Zeichner & Conklin, 2005

What is the impact of variations in programmatic delivery on the teaching efficacy of teaching candidates?

How do programmatic variations impact teaching candidates’ efficacy for classroom management, instructional strategies, and student engagement?

Elementary education majors admitted to the teaching curriculum

Combination of convenience and purposive sampling techniques (Teddlie & Tashakkori, 2009)

Two courses: foundations and practicum

Independent variable: specific delivery format

• looping (n = 25; 7 self-removed)

• blocked (n = 16)

• traditional (n = 25)

The Teachers’ Sense of Efficacy Scale (Tschannen-Moran & Woolfolk Hoy, 2001)

• Two versions of the TSES – long form (24 items) and short form (12 items)

TSES score: sum of responses to items written along a 9-point continuum from 1 (nothing) to 9 (a great deal)

  ◦ Example: How much can you do to control disruptive behavior in the classroom?

Includes domain-specific subscales to measure efficacy in student engagement, instructional strategies, and classroom management (scoring sketched below)

High overall reliability for the scale (α = .90) and its sub-scales (α computation sketched below):

  ◦ student engagement (α = .86)

  ◦ instructional strategies (α = .81)

  ◦ classroom management (α = .86)

Measurement at beginning of foundations and end of practicum for three delivery formats

TSES          1st admin (n = 16)    2nd admin (n = 16)    Mean Score    t-test
              M        SD           M        SD           Difference    (df = 30)
Total Score   67.06    8.32         83.63    8.83         16.57         5.46**
SE            25.56    4.08         29.81    3.76          4.25         3.06**
IS            17.50    2.73         23.25    2.35          5.75         6.38**
CM            24.00    3.83         30.56    3.75          6.56         4.89**

Note. SE = Student Engagement; IS = Instructional Strategies; CM = Classroom Management.
* p < .05, ** p < .01

ANOVA #1 to investigate differences in scores at the first administration

• Independent variable: context (looping, blocked, traditional)

• Statistically significant differences based on group membership at p < .01

  ◦ Total score (F = 23.65)

  ◦ Classroom management (F = 14.97)

  ◦ Instructional strategies (F = 19.12)

  ◦ Student engagement (F = 18.07)

• Post hoc analysis: Tukey's HSD

  ◦ Candidates enrolled in the looping section were significantly higher in overall efficacy and on each domain-specific subscale

ANOVA #2 to investigate differences in scores at the final administration

• Independent variable: context (looping, blocked, traditional)

• Statistically significant differences based on group membership at p < .01

  ◦ Total score (F = 16.89)

  ◦ Classroom management (F = 9.14)

  ◦ Instructional strategies (F = 23.97)

  ◦ Student engagement (F = 10.75)

• Post hoc analysis: Tukey's HSD

  ◦ The traditional group was significantly lower than the looping and blocked groups (analysis sketch below)

Blocked section benefited from:

• Multiple opportunities to implement instructional and management strategies described in coursework immediately in context

  ◦ Mastery and vicarious experiences

  ◦ Theory-to-practice connection

• Continuity and coherence between program purposes and field experiences (see Hammerness et al., 2005)

  ◦ Vicarious experiences

  ◦ Reinforces selecting competent, skilled teachers for the practicum

• Direct access to a university supervisor, cooperating teacher, and peers at several points during the day

  ◦ Social persuasion

  ◦ Access

Ball, D., Sleep, L., Boerst, T., & Bass, H. (2009). Combining the development of practice and the practice of development in teacher education. Elementary School Journal, 109(5), 458-474.

Clift, R. T., & Brady, P. (2005). Research on methods courses and field experiences. In M. Cochran-Smith & K. M. Zeichner (Eds.), Studying teacher education: The report of the AERA panel on research and teacher education (pp. 309-424). Mahwah, NJ: Lawrence Erlbaum Associates.

Grossman, P., Hammerness, K., & McDonald, M. (2009). Redefining teaching, re-imagining teacher education. Teachers and Teaching: Theory and Practice, 15(2), 273-289.

Grossman, P., & McDonald, M. (2008). Back to the future: Directions for research in teaching and teacher education. American Educational Research Journal, 45, 184-205.

Hammerness, K., Darling-Hammond, L., & Bransford, J. (2005). How teachers learn and develop. In L. Darling-Hammond & J. Bransford (Eds.), Preparing teachers for a changing world (pp. 358-389). San Francisco: Jossey-Bass.

Knoblauch, D., & Hoy, A. (2008). “Maybe I can teach those kids.” The influence of contextual factors on student teachers’ efficacy beliefs. Teaching and Teacher Education, 24, 166-179.

Oh, D. M., Ankers, A. M., Llamas, J. M., & Tomjoy, C. (2005). Impact of preservice student teaching experience on urban school teachers. Journal of Instructional Psychology, 32(1), 82-98.

Posnanski, T. J. (2007). A redesigned geoscience content course’s impact on science teaching self-efficacy beliefs. Journal of Geoscience Education, 55(2), 152-157.

Tschannen-Moran, M., & Woolfolk Hoy, A. (2001). Teacher efficacy: Capturing an elusive construct. Teaching and Teacher Education, 17, 783-805.

Woolfolk Hoy, A., & Spero, R. B. (2005). Changes in teacher efficacy during the early years of teaching: A comparison of four measures. Teaching and Teacher Education, 21, 343-356.

Zeichner, K. (2010). Rethinking the connections between campus courses and field experiences in college- and university-based teacher education. Journal of Teacher Education, 61, 89-99.

Zeichner, K., & Conklin, H. (2005). Teacher education programs. In M. Cochran-Smith & K. Zeichner (Eds.), Studying teacher education (pp. 645-735). New York: Routledge.
