Exploring Relationships Among Teacher's Schema of Effective Practice, Enacted Practice, & Student Learning: A Study of Text-Based Writing Tasks in Reading Instruction
Elaine Wang, University of Pittsburgh, Learning Sciences & Policy Ph.D. Program
Milestone Two Presentation, August 23, 2012

Defining the Issue
[Conceptual diagram: schemas for teaching & learning (functioning as motives or goals) shape enacted practice, which in turn shapes student learning; perceived constraints (e.g., curricula, policy) intervene between schema and enacted practice. Image from http://reversethinking.typepad.com/weblog/brain/]

Purpose of Study
This study aimed to investigate and generate hypotheses about the relationships among teachers' schema of effective practice, their enacted practice, and student learning, specifically around text-based writing tasks in fourth-grade reading instruction.

Some questions the resulting hypotheses might address:
- Do particular teacher schemas seem associated with particular student learning?
- Which factors might moderate the relationship between teachers' schema of effective practice and their enacted practice?
- Which elements of the schema of effective practice might be more susceptible to constraints in enacted practice?
- What might (mis)alignment between a teacher's schema and enacted practice mean for student learning?

Significance of Study
- A better understanding of the relationship between teachers' schema of effective practice and their enacted practice could improve student learning.
- Identifying constraints that hinder teachers from enacting ideal instruction could lead to interventions or professional development that explicitly address these concerns and perceptions.
- Recognizing the role of teachers' schema has implications for supporting instruction aligned with the approach advocated by standards and frameworks.

Theoretical Frameworks

Schema Theory
- Schemas help individuals understand the world by organizing assumptions or accumulated knowledge into distinct, strongly interconnected patterns that are later accessed (Anderson, 1977; Bartlett, 1932; Piaget, 1926).
- Schemas have the potential to instigate action; they can function as motives or goals (d'Andrade, 1992).
- Areas of theoretical work on teaching that reflect this function of schemas:
  - Mathematics teaching and learning (Ernest, 1989)
  - Teacher decision-making framework (Bishop & Whitfield, 1972)
  - Policy implementation research (Coburn, 2004; Spillane, Reiser, & Reimer, 2002)

Research on & Frameworks for Examining Writing Tasks
- Examining a writing task includes characterizing the cognitive demand of the prompt, the rigor of the evaluation criteria, the accepted student responses (Doyle, 1983; Matsumura, 2003), and teacher feedback (Hattie & Gan, 2011; Hattie & Timperley, 2007).
- [Diagram of writing task components: prompt, instruction, student responses, evaluation criteria, feedback]

Characterizing Cognitive Demand of Tasks
- Ambiguity and risk (Doyle, 1983; Doyle & Carter, 1984)
- Cognitive rigor (Matsumura et al., 2003)
- Taxonomy of Skills for Reading and Interpreting Fiction (Hillocks & Ludlow, 1984)
- Bloom's Taxonomy Revised (Anderson & Krathwohl, 2001)
- Depth of Knowledge (Webb, 2002)

Methods

Research Design, Context & Participants
- Qualitative exploratory comparative case study (Yin, 1994); theory-building case study (Eisenhardt, 1989)
- Three 4th-grade language arts teachers from three schools in a public district in a mid-Atlantic state
- Second-year participants in the larger project
- Sample representative of the larger group of 59 teachers

Participants
- Arlene: female, Caucasian, doctorate, 30 yrs. experience; class size 22; 100% African-American; 73% free/reduced-price lunch; prior achievement 36% Basic / 50% Proficient / 5% Advanced; factor score at the 42nd percentile
- Christine: female, Caucasian, bachelor's, 6 yrs. experience; class size 22; 68% African-American; 55% free/reduced-price lunch; prior achievement 5% Basic / 64% Proficient / 23% Advanced; factor score at the 73rd percentile
- Julie: female, Caucasian, master's, 9 yrs. experience; class size 25; 64% African-American; 9% free/reduced-price lunch; prior achievement 5% Basic / 68% Proficient / 27% Advanced; factor score at the 26th percentile
Data Collection

Quantitative (from the larger study):
- Schema of effective practice: 2 sets of survey items
- Enacted practice: 6 task ratings
- Perceived constraints: 1 set of survey items
- Student learning outcome: 2010 & 2011 standardized test scores; RTA ratings (class set)

Qualitative:
- Schema of effective practice: 2 60-minute semi-structured interviews (e.g., "What ought to be the role or purpose of text-based writing tasks?" "What should an effective text-based writing task look like?"), plus artifact-based questions
- Enacted practice: 6 tasks, each collected with a cover sheet, the task itself, the assessment scheme, and 4 pieces of graded student work (2 medium, 2 high)
- Perceived constraints: 2 60-minute semi-structured interviews
- Student learning outcome: RTA responses (class set & focal responses); sample RTA prompt: "Overall, how do you think Otis feels about his decision to hire the Tomcat? Explain…using 3-4 examples" (Correnti et al., 2012)

Focal Students
- Arlene (n=6): 4 male, 2 female; 6 African-American (100%); 5 free/reduced-price lunch (83%); average 2010 state test score 416.00; average gain score 2010-11: 9
- Christine (n=6): 3 male, 3 female; 5 African-American (83%); 1 free/reduced-price lunch (17%); average 2010 state test score 424.50; average gain score 2010-11: 2
- Julie (n=9): 1 male, 8 female; 0 African-American (0%); 1 free/reduced-price lunch (11%); average 2010 state test score 422.67; average gain score 2010-11: 17

Qualitative Data Analysis
- Transcription
- Multiple re-readings, analytic memos, iterative inductive coding, constant comparative method
- Descriptive case study write-ups
- Cross-case comparisons with matrix displays (Eisenhardt, 1989; Glaser & Strauss, 1967; Miles & Huberman, 1994); a toy tabulation sketch follows the coding schemes below

Schema of Effective Practice: Sample Categories & Codes
1. Main goal of reading instruction: 1.1 Comprehend text literally; 1.2 Apply strategies & skills; 1.3 Make text-to-self connections freely; 1.4 Understand big ideas/themes; 1.5 Apply themes to real life
2. Role of text-based writing tasks: 2.1 To confirm understanding of plot & characters; 2.2 To convey in writing ideas from discussion; 2.3 To explore ideas; 2.4 To argue a point of view; 2.5 To form an opinion based on ideas in text
3. Ideal form of text-based writing prompts: 3.1 Brief constructed responses; 3.2 Response journals; 3.3 Extended essays; 3.4 Multiple-choice responses; 3.5 Graphic organizers
4. Ideal text-based writing prompts: 4.1 Comprehension of discrete information; 4.2 Summary of plot events; 4.3 Descriptive characterization; 4.4 Analysis of how text elements; 4.5 Opinion-based interpretation / application of text

Enacted Practice: Sample Category Codes
- Prompt: cognitive demand; skill; form of task; focal literary element; type of writing; required use of text
- Instruction: pre-writing instructional activities
- Student responses: quality of high vs. medium; variation
- Evaluation criteria: form of scheme; criteria; levels
- Feedback: amount; nature; level

Perceived Constraints: Sample Categories & Codes (coded at the state, district, school, and classroom/teacher levels)
- Policy/organizational: state testing; formative testing; departmentalization; rules & routines
- Curricular: state standards; framework; pacing; support; resources
- Environmental: general climate; ability grouping; class size
- Human resources: support personnel; colleagues; inter-grade coordination
- Instructional: grading; teaching strategies
- Student: behavior; prior knowledge
- Intrapersonal (Buechler, 1991; Duffy & Roehler, 1986): patience
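The cross-case matrix displays cited above (Miles & Huberman, 1994) are, at bottom, tallies of coded excerpts by teacher and code. The Python sketch below is purely illustrative: the coded segments and their counts are invented, not data from this study, and it only shows one way a teacher-by-code matrix might be tabulated.

```python
# Illustrative sketch only: tallying coded interview excerpts into a
# teacher-by-code matrix display. All segments below are made up.
from collections import defaultdict

# Each tuple pairs a teacher with a schema code assigned during inductive coding.
coded_segments = [
    ("Arlene", "1.1"), ("Arlene", "1.4"), ("Arlene", "2.2"),
    ("Christine", "1.2"), ("Christine", "2.2"), ("Christine", "3.1"),
    ("Julie", "1.5"), ("Julie", "2.5"), ("Julie", "2.5"),
]

# Nested tally: matrix[teacher][code] = number of coded excerpts.
matrix = defaultdict(lambda: defaultdict(int))
for teacher, code in coded_segments:
    matrix[teacher][code] += 1

# Print a simple cross-case display: one row per teacher, one column per code.
codes = sorted({code for _, code in coded_segments})
print("Teacher    " + "  ".join(codes))
for teacher in sorted(matrix):
    row = "  ".join(str(matrix[teacher][c]).rjust(len(c)) for c in codes)
    print(teacher.ljust(11) + row)
```

In an actual matrix display the cells would hold excerpts or summaries rather than bare counts; the point here is only the teacher-by-code cross-tabulation.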
Student Learning Outcome: Sample Categories
- Understanding of nuance of the text
- Cognitive level at which students approach the task
- Extent to which the prompt is addressed
- Claims made
- Reasons given in support of claims
- Relevance and nature of textual evidence
- Explanation of inferences

Findings & Discussion

Schema of Effective Practice & Enacted Practice, Coded

Main goal of reading instruction
- Arlene. Schema: comprehend text literally / summarize accurately; make thematic connection & application to real life. Enacted: comprehend text literally, on a surface level.
- Christine. Schema: learn & apply reading strategies & skills. Enacted: apply reading strategies.
- Julie. Schema: apply big ideas to real life. Enacted: express supported opinion.

Role of text-based writing tasks
- Arlene. Schema: communicate freely, authentically; communicate ideas from discussion. Enacted: identify fragmented text information about characters and plot.
- Christine. Schema: communicate ideas from discussion. Enacted: explain how elements help readers; comprehend plot.
- Julie: express opinion about big ideas in text.

Prompt: cognitive process
- Arlene. Schema: summary / analysis. Enacted: basic comprehension.
- Christine. Schema: analysis. Enacted: basic inference; BCR.
- Julie. Schema: opinion-based interpretation. Enacted: opinion-based interpretation.

Prompt: form
- Arlene. Schema: authentic, non-formulaic, BCR. Enacted: graphic organizers.
- Christine. Schema: multi-part, open-ended; BCR. Enacted: BCR; graphic organizer.
- Julie. Schema: open response. Enacted: short open responses.

Representative Enacted Writing Tasks
- Arlene: "Name three traits for a character, and identify one piece of textual evidence for each trait." Form: graphic organizer. Focus: character.
- Christine: "Explain how the author uses sensory details to help readers visualize. Provide supporting details from the text." Form: BCR. Focus: author's craft / text element.
- Julie: Respond to one of four opinion-type questions on big ideas addressed in the story and provide text support (e.g., "Why is it important for everyone to have something to believe in?"). Form: short answer. Focus: big idea.

Instruction / guidance
- Arlene. Schema: discuss prompt; model with other text. Enacted: discuss prompt; group work with prompt; model with text.
- Christine. Schema: discuss prompt; group work with prompt; model response. Enacted: discuss prompt; graphic organizer; model with text.
- Julie. Schema: ensure understanding of prompt. Enacted: discuss text; complete graphic organizer for text; no direct teaching of prompt.

Feedback: form / process
- Arlene. Schema: conference; peer feedback; "authentic." Enacted: limited conferences.
- Christine. Schema: post exemplary responses; comments in margins; conference; allow rewrite. Enacted: written comments.
- Julie. Schema: conference; narrative comments; allow revisions. Enacted: discuss response with a peer and revise before submission; written comments.

Coherence Between Schema & Enacted Practice, Coded (Arlene | Christine | Julie)
- Prompt, content: weak/med. | med. | strong
- Prompt, form: weak | med. | med./strong
- Instruction, process: strong | strong | strong
- Assessment, content: med. | med. | med.
- Assessment, form: weak | weak | strong
- Feedback, content: n/a | med. | strong
- Feedback, form/process: med. | med. | med.
- Overall trend: weak/med. | med. | strong

Perceived Constraints (coded across state, district, school, and classroom/teacher levels)
- Policy/organizational: ARLENE, CHRIS.
- Curricular: ARLENE, Arlene
- Instructional: Julie, JULIE
- Student: CHRIS., Julie, Arlene
- Intrapersonal: CHRIS., Julie, Julie

Student Learning: RTA Responses, Class Sets
- Arlene (n=20): fully addresses prompt with full text, 15%; partially addresses prompt with part of text, 25%; characterization of character's feelings, 40%; summary, 10%; personal response, 5%; copied text, 5%
- Christine (n=21): fully addresses prompt, 19%; partially addresses prompt, 38%; characterization of character's feelings, 29%; summary, 5%; personal response, 10%; copied text, 0%
- Julie (n=22): fully addresses prompt, 45%; partially addresses prompt, 32%; characterization of character's feelings, 5%; summary, 14%; personal response, 5%; copied text, 0%

Student Learning: RTA Responses, Focal Students
- Arlene (n=6): fully addresses prompt with full text, 0%; partially addresses prompt with part of text, 33%; characterization of character's feelings, 66%
- Christine (n=9): fully addresses prompt, 17%; partially addresses prompt, 50%; characterization of character's feelings, 33%
- Julie (n=9): fully addresses prompt, 56%; partially addresses prompt, 33%; characterization of character's feelings, 11%
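The class-set percentages above presumably come from coding each student's RTA response into a single category and converting counts to shares of the class. A minimal Python sketch, using made-up responses and the category labels shown above, of how such a distribution might be computed:

```python
# Illustrative sketch only: converting coded RTA responses for one class set
# into a percentage distribution like the tables above. The responses are made up.
from collections import Counter

CATEGORIES = [
    "fully addresses prompt w/ full text",
    "partially addresses prompt w/ part of text",
    "characterization of character's feelings",
    "summary",
    "personal response",
    "copied text",
]

# One coded category per student response in a hypothetical class set of 20.
class_set = (
    ["fully addresses prompt w/ full text"] * 3
    + ["partially addresses prompt w/ part of text"] * 5
    + ["characterization of character's feelings"] * 8
    + ["summary"] * 2
    + ["personal response"] * 1
    + ["copied text"] * 1
)

counts = Counter(class_set)
n = len(class_set)
for category in CATEGORIES:
    print(f"{category}: {100 * counts[category] / n:.0f}%")
```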
Emergent Hypotheses of Relationships Among Constructs
1. Enacted practice at least partially aligns with, or follows from, the schema of effective practice.
2. Enacted practice significantly influences student learning.
3. Perception of high-level, policy-oriented constraints is associated with greater inconsistencies between the schema of effective instruction and enacted instruction.
4. The content and form of the prompt (along with the feedback process) are most susceptible to perceived constraints.
5. Coherence among elements of the schema is associated with stronger practice and student outcomes.
6. Prioritizing tasks that require analysis or interpretation of text is associated with better student outcomes.
7. A schema of effective instruction (and enacted practice) that focuses on providing extensive, explicit guidance on the given prompt hinders students' development of higher-level thinking skills.

Limitations

Methodological limitations:
- Small sample size
- Inter-rater agreement pending (a sketch of one way agreement might be indexed follows)
- Focus on text-based writing tasks
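Inter-rater agreement on the qualitative codes is listed above as pending. One common index for two raters assigning categorical codes is Cohen's kappa; the Python sketch below uses invented ratings and is an illustration of the index, not a description of the study's actual procedure.

```python
# Illustrative sketch only: Cohen's kappa for two raters assigning categorical
# codes to the same excerpts. The ratings below are invented.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical codes."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal code frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(
        (freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b)
    )
    return (observed - expected) / (1 - expected)

rater_1 = ["2.1", "2.2", "2.5", "2.5", "1.4", "2.2", "3.1", "2.1"]
rater_2 = ["2.1", "2.2", "2.5", "2.4", "1.4", "2.2", "3.1", "2.2"]
print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")
```

Unlike raw percent agreement, kappa discounts the agreement expected by chance from each rater's marginal code frequencies; values near 1 indicate strong agreement beyond chance.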
Questions & Comments

References
Anderson, R. C. (1977). The notion of schemata and the educational enterprise: General discussion of the conference. In R. C. Anderson, R. J. Spiro, & W. E. Montague (Eds.), Schooling and the acquisition of knowledge. Hillsdale, NJ: Lawrence Erlbaum.
Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives (Complete ed.). New York: Longman.
Bartlett, F. C. (1932). Remembering. London: Cambridge University Press.
Bishop, A. J., & Whitfield, R. (1972). Situations in teaching. London: McGraw-Hill.
Bloom, B. S. (1965). Taxonomy of educational objectives: The classification of educational goals. New York: David McKay.
Buechler, M. (1991). Constraints on teachers' classroom effectiveness: The teacher's perspective (Policy Bulletin). Bloomington, IN: Indiana University Education Policy Center.
Coburn, C. E. (2004). Beyond decoupling: Rethinking the relationship between the institutional environment and the classroom. Sociology of Education, 77, 211-244.
Correnti, R., Matsumura, L. C., Hamilton, L., & Wang, E. (2012). Combining multiple measures of students' opportunities to develop analytic text-based writing. Educational Assessment (in press).
D'Andrade, R. G. (1992). Schemas and motivation. In R. G. D'Andrade & C. Strauss (Eds.), Human motives and cultural models (pp. 23-44). New York, NY: Cambridge University Press.
Doyle, W. (1983). Academic work. Review of Educational Research, 53, 159-199.
Doyle, W., & Carter, K. (1984). Academic tasks in classrooms. Curriculum Inquiry, 14(2), 129-149.
Duffy, G., & Roehler, L. (1986). Constraints on teacher change. Journal of Teacher Education, 37(1), 55-58.
Eisenhardt, K. M. (1989). Building theories from case study research. The Academy of Management Review, 14(4), 532-550.
Ernest, P. (1989). The impact of beliefs on the teaching of mathematics. In P. Ernest (Ed.), Mathematics teaching: The state of the art (pp. 249-254). London: Falmer Press.
Glaser, B., & Strauss, A. (1967). The discovery of grounded theory: Strategies for qualitative research. London: Weidenfeld and Nicolson.
Hattie, J., & Gan, M. (2011). Instruction based on feedback. In R. E. Mayer & P. A. Alexander (Eds.), Handbook of research on learning and instruction (pp. 249-27). New York: Routledge.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81-112.
Hess, K. K. (2009). Cognitive rigor matrix. Retrieved from http://www.nciea.org/publications/CRM_ELA_KH11.pdf
Hillocks, G., Jr., & Ludlow, L. H. (1984). A taxonomy of skills in reading and interpreting fiction. American Educational Research Journal, 21(1), 7-24.
Kepner, C. G. (1991). An experiment in the relationship of types of written feedback to the development of second-language writing skills. Modern Language Journal, 75, 305-313.
Matsumura, L. C. (2002). Measuring instructional quality in accountability systems: Classroom assignments and student achievement. Educational Assessment, 8(3), 207-229.
Matsumura, L. C., Garnier, H., Slater, S. C., & Boston, M. B. (2008). Toward measuring instructional interactions "at-scale." Educational Assessment, 13(4), 267-300.
Matsumura, L. C., Slater, S. C., Wolf, M. K., Crosson, A., Levison, A., Peterson, M., Resnick, L., & Junker, B. (2006). Using the Instructional Quality Assessment Toolkit to investigate the quality of reading comprehension assignments and student work (CSE Technical Report No. 669). Los Angeles: University of California, National Center for Research on Evaluation, Standards, and Student Testing (CRESST).
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. Thousand Oaks, CA: Sage.
Newmann, F. M., Bryk, A. S., & Nagaoka, J. (2001). Authentic intellectual work and standardized tests: Conflict or coexistence? Chicago: Consortium on Chicago School Research.
Piaget, J. (1926). The language and thought of the child. New York: Harcourt Brace.
Piaget, J. (1971). Biology and knowledge. Chicago: University of Chicago Press.
Schutzwohl, A. (1998). Surprise and schema strength. Journal of Experimental Psychology: Learning, Memory, and Cognition, 24, 1182-11.
Spillane, J. P., Reiser, B. J., & Reimer, T. (2002). Policy implementation and cognition: Reframing and refocusing implementation research. Review of Educational Research, 72(3), 387-431.
Webb, N. L. (2002). Alignment study in language arts, mathematics, science, and social studies of state standards and assessments for four states. Washington, DC: Council of Chief State School Officers.
Yin, R. (1994). Case study research: Design and methods (2nd ed.). Thousand Oaks, CA: Sage.