CONTENTS – FACILITATOR’S GUIDE
INTRODUCTION TO THE MASSACHUSETTS ADOLESCENT LITERACY PROFESSIONAL DEVELOPMENT
MODULES ..............................................................................................................................1
Section 1: An Overview of the Content and Structure of the Modules ........................................ 1
Section 2: Considerations for How to Use the Modules ............................................................... 6
Section 3: Sample Formats, Sequences and Use of Modules ....................................................... 9
In Conclusion ............................................................................................................................... 15
ACKNOWLEDGEMENTS ........................................................................................................ 19
CONTEXT FOR MODULE 3 ..................................................................................................... 21
UNIT 1: ASSESSING ASSESSMENT .......................................................................................... 28
Session 1: Assessment Foundations ............................................................................................ 34
Session 2: Balanced Assessment ................................................................................................. 44
Session 3: A Comprehensive Assessment Program..................................................................... 59
UNIT 2: DISCIPLINARY LITERACY ........................................................................................... 69
Session 1: Enhancing Assessment Practices ................................................................................ 71
Session 2: Formative Assessment ............................................................................................... 80
Session 3: Involving Students in the Assessment Process ........................................................... 88
UNIT 3: STUDENTS AT RISK ................................................................................................... 98
Session 1: The Role of Assessment for Students At Risk ........................................................... 100
Session 2: Literacy Screening and Diagnostic Assessment ........................................................ 112
Session 3: Progress Monitoring ................................................................................................. 129
LEADERSHIP ROLES AND RESPONSIBILITIES FOR MODULE 3: ASSESSMENT .......................... 152
Leadership Is Key to Improving Adolescent Literacy ...............................................................
The Role of Leadership in Implementing the Four Adolescent Literacy Modules .................... 155
Leadership Roles and Responsibilities Specific to Support of Module 3 .................................. 158
Actions Literacy Leaders Can Take ............................................................................................ 161
Introduction to the Modules and Units
Welcome to the Massachusetts Adolescent Literacy Professional Development
Modules! We hope the Modules will be a valuable resource that helps teachers and school leaders gain insight into how to improve adolescent literacy learning in their schools.
There are four Professional Development Modules in the series. This set of resources is unique in its comprehensiveness and its specific focus on literacy improvement at the middle and high school level.
Each Module addresses a critical topic in the field:
Module 1 – Adolescent Reading, Writing and Thinking
Module 2 – Content-Area Literacy
Module 3 – Assessment
Module 4 – Tiered Instruction
Each Module is divided into three units, and each unit is further divided into 3 to
6 sessions. For each Module, we offer a Facilitator’s Guide, a Participant’s
Resource Packet, and a set of PowerPoint slides with extensive notes. The
Facilitator’s Guide provides a wealth of references and citations for further learning and specific suggestions for readings, guidance in facilitating conversations among participants using specific discussion protocols, links to web-based resources, and hints for how to deepen learning and understanding through application to practice and follow-up exercises. The Participant’s
Resource Packet is a collection of tools participants can use to explore the topic of each session, including materials such as surveys, samples, and discussion protocols. The set of PowerPoint slides anchors the facilitator’s presentation with data, graphs, and discussion-generating questions.
Module 1: Adolescent Reading, Writing, and Thinking Page 1
The content of each Module was developed in collaboration with the Office of
Literacy at the Massachusetts Department of Elementary and Secondary
Education (MA ESE). The Modules address four key topics that educators need to understand, and need resources to address, if they seek to improve adolescent literacy learning. Each Module is divided into three units, as follows:
Module 1: Adolescent Reading, Writing and Thinking
   Unit 1: Text, Activity and Context
   Unit 2: Understanding and Supporting Skilled Reading
   Unit 3: The State, National and International Conversation about Adolescent Literacy

Module 2: Content-Area Literacy
   Unit 1: Content-Based Reading and Writing Skills
   Unit 2: Cross-Content Reading and Writing Skills and Strategies
   Unit 3: Supporting Struggling Readers and Writers

Module 3: Assessment
   Unit 1: Assessing Assessment
   Unit 2: Disciplinary Literacy
   Unit 3: Students At Risk

Module 4: Tiered Instruction
   Unit 1: An Overview of Tiered Instruction
   Unit 2: Setting Up a System of Tiered Response
   Unit 3: Key Components of Literacy Interventions
The introductions to each Module and each unit provide context for the topic at hand. The Modules also explain how the content applies to English Language Learners, and each includes a Leadership Addendum that provides targeted resources and suggestions for how school leaders can a) support the implementation of actions related to the Module at their school and b) extend teacher learning through the Modules themselves.
The content of the Modules incorporates recent research and practice literature in each of these important areas. The design of the learning experiences within each unit reflects what we know about effective adult learning and brain-based learning.
Finally, explicit connections within each Module to the Common Core Standards for English Language Arts make clear for professionals the high levels of relevance and applicability in each session topic.
Together, these materials provide a comprehensive professional development resource that has been designed to be as flexible as possible for use within a wide variety of learning contexts. We anticipate that knowledgeable administrators, coaches, and professional developers will strategically select and deliver units and sessions within units across multiple Modules to effectively meet the needs of their particular school communities. Suggestions for how facilitators may want to use the Modules, including sample schedules and selection criteria, are given in Section 3 below.
Structure of Each Module
To facilitate flexible delivery of the content, each unit is divided into 3 to 6 sessions. Each session — which includes enough content to engage participants for roughly two hours — provides a menu of resources that can be used and adapted to meet a variety of needs. Depending on the instructional context and format, some of the resources provided may not be necessary or workable. For instance, if using a study group format in which participants meet for an hour or two weekly, completing the suggested readings in the preparation section of each session prior to each meeting will be an essential component of the work.
Alternatively, if a team engages in lengthy, multi-day professional development sessions, the readings may be skimmed or quickly reviewed so that participants can move more quickly to the PowerPoint slides and hands-on activities.
We mention this to acknowledge that facilitators are not expected to use every resource or activity provided in each session; indeed, we have intentionally provided more resources than most facilitators will be able to use in a single session. Our intention is to provide enough resources for knowledgeable facilitators to choose those that are most congruent with each school’s and group’s instructional needs.
When reviewing the materials, facilitators should pay attention to the following structure:
Session Objective
The objective states the essential knowledge and skills that we hope participants will gain by the end of each session.
INTRODUCTION
This section contains a brief introduction to the session for facilitators, often including references to relevant research, theory, or practice resources.
BEFORE THE SESSION
This section includes references to articles or resources that we suggest both facilitators and participants read before presenting/attending the professional development sessions. Some resources are labeled “for further background information.” Facilitators may wish to read and review these, in addition to any required readings, to build background knowledge and enrich the delivery of the materials and activities in a given session. This section may also include “Notes to Facilitators,” specific tips for how to prepare for the session, as well as suggestions for materials participants may need to bring to the session in order to fully engage in particular activities (e.g., content-area textbooks, examples of student writing, etc.).
DURING THE SESSION
This section provides detailed descriptions of the primary activities available for each session. Typically, each session will include a warm-up, one or two core discussion or analysis activities that focus on main ideas from articles, relevant data sets, or online resources, and a concluding activity that connects the content to teacher practices or the processes of the school. All activities described in this section align with the PowerPoint slides that have been created for each session.
Note: The PowerPoint slides contain a great deal of information, on the slides themselves and in the “notes.” If facilitators choose not to use the PowerPoint slides (perhaps due to lack of technological resources, or due to small-group
formats that are not conducive to using slides), then we strongly recommend making the slides available to participants as handouts.
AFTER THE SESSION
This section provides facilitators with suggestions for helping participants bring resources and ideas from the sessions back into their classrooms or other educational settings. We often provide additional strategies, handouts, and readings that expand upon session content. Also, in this section we provide instructions outlining any assignments (e.g., readings, data that should be collected, etc.) that participants should complete before the next session.
COMMON CORE CONNECTIONS
This section briefly highlights connections to the Common Core Standards and offers some thoughts on how the content within the session pertains to national expectations for College and Career Readiness.
REFERENCES AND ADDITIONAL RESOURCES
References are limited to those resources that are directly cited or used in the professional development sessions, while additional resources are suggested books, articles, and online resources that facilitators may wish to use in future sessions or when coaching teachers.
Development of the Modules
The Modules were developed in collaboration with staff from the
Massachusetts Office of Literacy under the direction of Joshua Lawrence and
Jacy Ippolito. Development was funded through a grant from the Massachusetts
Department of Elementary and Secondary Education and occurred throughout the spring of 2010. The Modules were written by Joshua Lawrence, Jacy
Ippolito, Patricia Newhall, and Keryn Kwedor. Information on English Language
Learners was provided by Claire White and her colleagues. Correlations to the
Common Core Standards were provided by Maura Wolk. The Leadership
Addenda were developed by Julie Meltzer and Melvina Phillips. PCG Education produced the Modules for distribution.
Section 2: Considerations for How to Use the Modules
Without a doubt, facilitators will need to take many intersecting factors into account when determining how best to use the Modules to provide quality professional development, including the budget for stipending participants, where and when the professional development will take place, and whether to serve snacks. Depending upon the goals of the professional development being offered, as well as the group of participants, facilitators might make very different choices about which sessions and units will be most helpful. We recommend that facilitators think carefully about the following four issues to ensure that the use of the Modules will meet the needs of a given group of participants:
1. Student Demographics and Instructional Needs
These Modules are designed to be delivered to middle and high school teachers in schools serving a wide variety of students. Having some knowledge of a school’s particular student demographics (e.g., a large number of English-Language Learners) and particular instructional needs
(e.g., lack of academic vocabulary) can productively guide the use of these Modules and help determine which content and supporting resources will be most relevant. We suggest that outside professional developers meet with school leaders and teams of teachers (e.g., literacy leadership teams) before delivering these Modules in order to discuss the needs of the students served by the participants. Professional developers working within the districts and/or schools in which the professional development will take place are encouraged to participate in a larger needs-assessment process (for a review of how to conduct a literacy needs assessment, see Vogt & Shearer, 2007, chapter 3).
2. Teacher Demographics and Professional Development Needs
Teachers’ particular strengths and professional development needs should also be carefully considered when selecting which units and sessions to include along with which resources and learning experiences would be most helpful in conveying key ideas. While the Modules have been designed to introduce theory and practices related to adolescent literacy, tailoring can be done to adapt these materials to address the experience levels of the participating teachers. Additional factors that professional developers will want to think about when deciding how to use the resources provided in these Modules include:
Are the teachers primarily novices, veterans, or a mixture?
How many teachers will be attending the professional development sessions?
Will the same group of teachers attend each session?
Are the teachers responsible for one subject area, or multiple subject areas?
Are the teachers responsible for one grade level, or multiple grade levels?
How much flexibility do the teachers have when designing their curriculum?
The answers to such questions — determined with the help of a literacy leadership team, administrator, and/or literacy coach — can help professional developers strategically choose which Modules, units, and sessions to deliver, and how much adaptation will be necessary to meet particular teachers’ needs.
3. Time Constraints
While a number of logistical elements may affect the successful delivery of professional development sessions, perhaps none is as influential as time. These Modules have been designed to be delivered in a variety of sequences (see Section 3 where we offer a few potential ways the
Modules might be delivered in multi-day, afterschool, or induction formats). Close attention should be paid to how much information to deliver to teachers over various stretches of time so as not to overwhelm or bore participants. If professional development is provided weekly, then Modules and individual units within Modules may be completed each semester with ease. If professional development is limited to monthly sessions, or multi-day institutes, then providers will need to consider student and teacher needs and strategically select which Modules, units, and sessions will have the greatest effect.
4. Professional Development Format
Selecting a format for professional development delivery is a critical component of the planning process. Four resources we suggest consulting when deciding on the scope, sequence, and format for professional development are:
Irvin, J. L., Meltzer, J., & Dukes, M. S. (2007). Taking action on
adolescent literacy: An implementation guide for school leaders.
Alexandria, VA: ASCD. Chapters 6 and 9.
Irvin, J. L., Meltzer, J., Mickler, M. J., Phillips, M., & Dean, N. (2009).
Meeting the challenge of adolescent literacy: Practical ideas for
literacy leaders. Newark, DE: International Reading Association.
Moran, M. C. (2007). Differentiated literacy coaching: Scaffolding for student and teacher success. Alexandria, VA: ASCD.
Walpole, S., & Beauchat, K. A. (2008). Facilitating teacher study groups. The Literacy Coaching Clearinghouse. Retrieved from http://www.literacycoachingonline.org/briefs/StudyGroupsBrief.pdf
These resources collectively discuss the importance of connecting professional development to sustained systemic change efforts, respecting adult learners’ autonomy and substantial knowledge, allowing for choice, and carefully considering the degree to which the professional development is collaborative (i.e., how much teachers should guide the direction of the work versus how much the materials and/or Facilitator’s Guide should guide the process). Moran, for example, highlights eight different ways of providing professional development for teachers, from less time-intensive methods such as providing resources (e.g., distributing materials, articles, books, etc.) and providing literacy content presentations (e.g., PowerPoint presentations and discussions), to more time-intensive methods such as organizing study groups, co-planning with teachers, peer coaching, conducting demonstration lessons, and co-teaching.
We strongly believe that these Modules will be most effective if used as part of a larger, comprehensive school- or district-wide effort to improve adolescent literacy instruction across content areas. This would include elements such as: regular team meetings, facilitator-led study groups, school-based instructional coaching, the participation of a literacy leadership team, and the participation of a data-analysis team.
The following resources may be of critical importance to schools considering how best to use these Modules within a larger, strategic professional development program:
Irvin, J. L., Meltzer, J., Dean, N., & Mickler, M. J. (2010). Taking the lead
on adolescent literacy: Action steps for schoolwide success.
Thousand Oaks, CA: Corwin Press.
Meltzer, J., & Jackson, D. (2010). Guidelines for developing an effective
district literacy action plan (Version 1.1). Malden, MA:
Massachusetts Department of Elementary and Secondary
Education and Public Consulting Group. Retrieved from http://www.doe.mass.edu/literacy/presentations/LiteracyGuidelines.pdf
We understand that time and resources are limited and that the format for professional development may vary widely. Professional development might be offered in weekly or monthly 1-2 hour sessions, week-long institutes, or as part of an existing teacher-induction course or mentoring program. Given these realities, we strongly recommend that professional development providers strategically select the Modules and units that will most efficiently target the needs and interests of the teachers and students with whom they will be working. While the Modules and units have been designed to be delivered in a particular sequence, much attention has also been given to creating the units and sessions as “stand-alone” experiences so that facilitators might tailor the use of the resources to address particular, more urgent needs.
Section 3: Sample Formats, Sequences and Use of Modules
Once facilitators have identified the participants, specific student needs, and professional development format, it is quite easy to adapt the materials in the
Modules to meet the needs of a given group. Of course, the material in each
Module could be delivered “as is” in a course format with the sessions presented in sequential order over a period of several months. But oftentimes this may not be possible or desirable due to a variety of factors. Below we have provided a few scenarios as examples of how these materials might be used differently to meet particular school and district professional development needs.
Scenario # 1 — Two-Day Start of School Year Professional Development
A high school principal has decided that her faculty would be best served by exposure to these materials at the beginning of the school year during a two-day intensive workshop. The principal and literacy leadership team sought the support of multiple volunteer facilitators (e.g., coaches, department heads, teacher leaders), and small cross-content-area professional learning communities were formed for the first day, with teachers meeting in departmental groups on the second day. Following recommendations made by the school’s literacy leadership team at the end of the previous school year (after an analysis of recent test data and student work), the principal agreed that a focus
on vocabulary instruction and improving assessment practices would be most important. Given this focus, the literacy leadership team chose the following sequence of materials:
Day 1 — What is at stake for students, and how do we support vocabulary growth?
Module 1, Unit 2: Understanding and supporting skilled reading
Using talk moves to enrich student academic language
Using discussion structures to enrich student academic language
Language development and academic vocabulary
Module 2, Unit 2: Cross-Content Reading and Writing Skills and
Strategies
Supporting Vocabulary Development in the Content Areas
Day 2 — How can we improve student achievement by improving assessment practices?
Module 3, Unit 1: Assessing Assessment
Assessment Foundations
Balanced Assessment
A Comprehensive Assessment Program
To make the case for the importance of collective action to improve adolescent literacy, the principal reviewed and selected a few resources and activities from
Module 1, Unit 3 to create a context for why this work is important and why she would be expecting teachers to take the learning from the next two days and apply it in all content-area instruction.
This plan for professional development would necessarily include multiple formal and informal mechanisms throughout the year to support teachers in reviewing and implementing new practices (e.g., ongoing formal mentoring or
coaching, peer-coaching, professional learning communities, study groups, etc.).
Moreover, the savvy principal would devote some time during professional development days throughout the school year to revisit ideas and practices introduced during these two days.
Scenario # 2 — School-Wide Bi-Monthly Professional Development
A middle school principal has decided that her faculty would be best served by exposure to these materials twice per month during half-day professional development sessions. In consultation with the school’s literacy leadership team
(comprised of a reading specialist, a literacy coach, a special educator, the assistant principal, three content-area teachers, and the school’s guidance counselor), cross-content area teams of teachers have formed into long-term professional learning communities (PLCs). During the half-day professional development sessions, members of the literacy leadership team have volunteered to facilitate the professional development sessions for the various
PLCs (with the understanding that by mid-year, the PLCs would begin to facilitate the sessions themselves). Having reviewed last year’s test data, student work, and a brief faculty survey, the principal and leadership team learned that most content-area teachers never received formal instruction in content-area literacy practices and would like to know more about how to increase content-area achievement through literacy-related practices. Thus it was collectively decided that the primary focus of the year would be to tackle Module 2, with small additions from the other Modules as needed.
September and October (4 half-days total):
Module 2, Unit 1: Content-Based Reading and Writing Skills
Day 1: What Does It Mean to Teach “Discipline-Specific” Literacy
Skills?
Day 2: Thinking like a Critic, Historian, Mathematician, and Scientist
Day 3: Reading like a Critic, Historian, Mathematician, and Scientist
Day 4: Writing and Presenting like a Critic, Historian,
Mathematician, and Scientist
November and December (2 half-days total):
Module 2, Unit 2: Cross-Content Reading and Writing Skills and
Strategies
Structuring Lessons to Promote Comprehension
The Skills that Underlie Strategic Reading
January and February (2 half-days total):
Module 2, Unit 2: Cross-Content Reading and Writing Skills and
Strategies
Writing Across the Content Areas
Supporting Vocabulary Development in the Content Areas
March, April, May, & June (4 half-days total):
Module 2, Unit 3: Supporting Struggling Readers and Writers
Identifying Ways to Support Struggling Readers and Writers
Text Considerations, Part 1: Text Structure
Text Considerations, Part 2: Multiple Texts and Multiple Purposes
Using Graphic Organizers to Overcome Text Difficulty
This plan for professional development would necessarily include multiple formal and informal mechanisms throughout the year to support teachers in reviewing and implementing new practices (e.g., ongoing formal mentoring or coaching, peer-coaching, professional learning communities, study groups, etc.).
Also, particular PLCs might choose to focus more intensively on particular aspects of the Module, and the principal and literacy leadership team might encourage PLCs and teacher leaders to become resident experts on particular topics—expertise which then could be shared across groups.
Scenario # 3 — New Teacher Induction
A district might choose to use these Modules within the existing structure of a formal teacher induction or mentoring program or professional development course. As part of a weekly or bi-monthly course on adolescent literacy, these
Modules could easily be adapted to form the heart of a strong new teacher course curriculum. For teachers brand new to the profession, Modules 1 and 2 might make the most sense for yearlong study, as they introduce the basic premises behind adolescent literacy development and instruction. To differentiate for teachers new to a district, but not new to teaching, several sessions from Modules 1 and 2 could form a solid base, while most of the induction/mentoring could focus on improving assessment practices and solidifying knowledge of tiered instruction.
Fall Semester
Module 1, Unit 1: Text, Activity, and Context
Module 1, Unit 3: The State, National, and International Conversation about
Adolescent Literacy
Module 2, Unit 1: Content-Based Reading and Writing Skills
Module 3, Unit 1: Assessing Assessment
Module 3, Unit 2: Disciplinary Literacy
Spring Semester
Module 1, Unit 2: Understanding and Supporting Skilled Reading
Module 2, Unit 3: Supporting Struggling Readers and Writers
Module 3, Unit 3: Students at Risk
Module 4, Unit 1: An Overview of Tiered Instruction
Module 4, Unit 2: Setting up a System of Tiered Response
As with the other professional development plans, this sequence would require in-school support (e.g., ongoing formal mentoring or coaching, peer-coaching, professional learning communities, study groups, etc.) in order for course concepts and activities to become part of teachers’ regular classroom practices.
Scenario #4: Tiered Instruction Team
The district leadership team has decided that Tiered Instruction (TI) has shown promise at the elementary level and wants to explore what a secondary model of TI would look like to improve reading/literacy at the middle school and high school level. The superintendent creates a TI Middle School Task Force and asks the committee to review best practices and make a recommendation to the administrative team at the end of the year. The TI Team reviews the resources in the Modules and realizes that deeper collective understanding of Tier 1 instruction and assessment will be needed to complete their assignment. They decide to meet weekly for 1.5 hours over a period of four months and to use the materials in the Modules in the following order:
Module 4: Units 1, 2 and 3
An Overview of Tiered Instruction
Setting Up a System of Tiered Response
Key Components of Literacy Interventions
Module 2: Unit 3
Supporting Struggling Readers and Writers
Module 3: Units 1 and 3
Assessing Assessment
Students At Risk
As the Task Force worked through the materials, members identified specific activities, resources, and slides that they thought would help others understand the rationale behind the specific TI design they recommended to the district leadership team. The result was a customized “toolkit,” created from the resources in the Modules, that could be used to communicate key messages and understandings about tiered instruction to others.
In Conclusion
Version 1.0 of the Massachusetts Adolescent Literacy Professional Development
Modules was designed to be a key part of high-quality professional development efforts focusing on improving adolescent literacy learning.
However, the Modules themselves are only one part of the professional development experience. Knowledgeable facilitators who have experience working with adult learners will be crucial in order for the content to be delivered most effectively. Also, maximum effects will only be achieved when these materials are delivered in schools and districts that have already established time and structures for adults to collaborate and communicate. As part of a larger systemic focus on adolescent literacy and achievement, it is our hope that these materials can serve multiple purposes and spark many new ideas and practices in middle and high school classrooms across the state of
Massachusetts.
REFERENCES
Irvin, J. L., Meltzer, J., & Dukes, M. S. (2007). Taking action on adolescent literacy: An implementation guide for school leaders. Alexandria, VA:
ASCD.
Irvin, J. L., Meltzer, J., Mickler, M. J., Phillips, M., & Dean, N. (2009). Meeting the challenge of adolescent literacy: Practical ideas for literacy leaders.
Newark, DE: International Reading Association.
Meltzer, J., & Jackson, D. (2010). Guidelines for developing an effective district literacy action plan (Version 1.1). Malden, MA: Massachusetts
Department of Elementary and Secondary Education and Public
Consulting Group. Retrieved from http://www.doe.mass.edu/literacy/presentations/LiteracyGuidelines.pdf
Moran, M. C. (2007). Differentiated literacy coaching: Scaffolding for student and teacher success. Alexandria, VA: ASCD.
Vogt, M. J., & Shearer, B. A. (2007). Reading specialists and literacy coaches in the real world (2nd ed.). Boston, MA: Pearson.
Walpole, S., & Beauchat, K. A. (2008). Facilitating teacher study groups. The
Literacy Coaching Clearinghouse. Retrieved from http://www.literacycoachingonline.org/briefs/StudyGroupsBrief.pdf
Module 3: Assessment Page 15
A word about the Common Core Standards for English Language Arts
Note: The Massachusetts Adolescent Literacy Professional Development Modules were written at the same time as the new Common Core Standards were being developed. Although we make connections to the standards in each session, we still encourage readers to reference the standards independently and use them as a major resource alongside the professional development Modules. We have provided a brief introduction to those standards here.
The Common Core State Standards for English Language Arts & Literacy in
History/Social Studies, Science, and Technical Subjects
Background
In 2009, Massachusetts, along with 47 other states, the District of Columbia,
Puerto Rico, and the Virgin Islands, agreed to support the development of common standards in English language arts that would be designed to prepare students to be ready for college and careers. The K-12 Common Core State
Standards for English Language Arts & Literacy in History/Social Studies, Science,
and Technical Subjects were published in June 2010.
Key Design Elements
Grade levels for K–8; grade bands for 9–10 and 11–12
The Standards use individual grade levels in kindergarten through grade 8 to provide useful specificity; the Standards use two-year bands in grades 9–12 to allow schools, districts, and states flexibility in high school course design.
An integrated model of literacy
Although the Standards are divided into Reading, Writing, Speaking and
Listening, and Language strands for conceptual clarity, the processes of communication are closely connected.
Research and media skills are blended into the Standards as a whole
To be ready for college, workforce training, and life in a technological society, students need the ability to gather, comprehend, evaluate, synthesize, and report on information and ideas, to conduct original research in order to answer questions or solve problems, and to analyze and create a high volume and extensive range of print and non-print texts in media forms old and new. The need to conduct research and to produce and consume media is embedded into
every aspect of today’s curriculum. In like fashion, research and media skills and understandings are embedded throughout the Standards rather than treated in a separate section.
Shared responsibility for students’ literacy development
The Standards insist that instruction in reading, writing, speaking, listening, and language is a shared responsibility within the school. The K–5 standards include expectations for reading, writing, speaking, listening, and language applicable to a range of subjects, including but not limited to ELA. The grades 6–12 standards are divided into two sections, one for ELA and the other for history/social studies, science, and technical subjects. This division reflects the unique, time-honored place of ELA teachers in developing students’ literacy skills while at the same time recognizing that teachers in other areas must have a role in this development as well.
Part of the motivation behind the interdisciplinary approach to literacy promulgated by the Standards is extensive research establishing the need for college- and career-ready students to be proficient in reading complex informational text independently in a variety of content areas. Most of the required reading in college and workforce training programs is informational in structure and challenging in content; postsecondary education programs typically provide students with both a higher volume of such reading than is generally required in K–12 schools and comparatively little scaffolding.
Reading: Text complexity and the growth of comprehension
The Reading standards, divided into standards for literary and informational text, place equal emphasis on the sophistication of what students read and the skill with which they read. Whatever they are reading, students must also show a steadily growing ability to discern more from and make fuller use of text, including making an increasing number of connections among ideas and between texts, considering a wider range of textual evidence, and becoming more sensitive to inconsistencies, ambiguities, and poor reasoning in texts.
Writing: Text types, responding to reading, and research
The Standards acknowledge the fact that although some writing skills, such as the ability to plan, revise, edit, and publish, are applicable to many types of writing, other skills are more properly defined in terms of specific writing types: arguments, informative/explanatory texts, and narratives. The writing standards explicitly call for students to be able to write in response to what they read.
Because of the centrality of writing to most forms of inquiry, research standards
are prominently included in this strand, though skills important to research are infused throughout the document.
Speaking and Listening: Flexible communication and collaboration
Including but not limited to skills necessary for formal presentations, the
Speaking and Listening standards require students to develop a range of broadly useful oral communication and interpersonal skills. Students must learn to work together, express and listen to ideas, integrate information from oral, visual, and multimodal sources, evaluate what they hear, use digital media and visual displays strategically to help achieve communicative purposes, and adapt speech to context and task.
Language: Conventions and vocabulary
The standards on conventions and effective language use include the essential
“rules” of formal written and spoken English, but they also approach language as a matter of craft and informed choice among alternatives. The vocabulary standards focus on understanding words, their relationships, and their nuances and on acquiring new words and phrases, particularly general academic and domain-specific vocabulary.
Appendices A, B, and C
Appendix A contains supplementary material on reading, writing, speaking and listening, and language as well as a glossary of key terms. Appendix B consists of text exemplars illustrating the complexity, quality, and range of reading appropriate for various grade levels. Appendix C includes annotated samples demonstrating at least adequate performance in student writing at various grade levels.
The Standards are a project of the Council of Chief State School Officers and the
National Governors Association and are available at www.corestandards.org
and the Massachusetts Department of Elementary and Secondary Education website, www.doe.mass.edu.
ACKNOWLEDGEMENTS
The creation of the Massachusetts Adolescent Literacy Professional
Development Modules has been a highly collaborative effort, and we would like to specifically thank some of the people who contributed greatly to the design and content of these materials.
Thank you to:
Cheryl Liebling, former Director of Literacy for the Department of Elementary and Secondary Education, MA, for her early vision and guidance.
Dr. Catherine Snow and Dr. Vicki Jacobs, Harvard Graduate School of Education, for their suggestions and materials.
Erica Garland and the students of EDU 260A at Salem State College for their suggestions regarding the use of multiple texts and feedback on the delivery of
Module 2.
Claire White, M. Catherine O’Connor, and members of the Strategic Education
Research Partnership (SERP) Boston Field Site Design Team.
Dr. Elizabeth Moje, Arthur F. Thurnau Professor of Literacy, Language, and
Culture in Educational Studies at the University of Michigan, Ann Arbor, MI.
The School Reform Initiative, which creates transformational learning communities fiercely committed to educational equity and excellence. We use their protocols and tools in many sessions.
See http://schoolreforminitiative.org/
Rebekah Lashman, the Commonwealth Corporation, and Executive Office of
Labor and Workforce Development.
Center for Labor Market Studies, Northeastern University.
Teachers and staff at the Timilty Middle School in Roxbury, MA.
Oneida Fox-Roye and the Literacy office of the Boston Public Schools.
Our colleagues and collaborators at the Massachusetts Department of
Elementary and Secondary Education, especially Susan Wheltle for writing the introduction to the Core Standards, and Dot Earle for continued support across the many phases of this work.
Ron Ferguson and the Achievement Gap Initiative.
The Landmark School Outreach Program, for its commitment to professional development.
CONTEXT FOR MODULE 3
In 2004, Biancarosa and Snow declared an adolescent literacy crisis in the
United States, which spurred focused attention on reading and writing instruction at the middle and high school levels. Modules 1 and 2 highlight the foundational thinking essential to designing curriculum and instruction that develop the skills and habits of mind toward academic proficiency, and that prepare students for higher education and the 21st-century workplace.
Consequently, the focus on teaching literacy requires a simultaneous focus on
assessing literacy. This Module invites consideration of the role that assessment should play in ensuring that all students are making effective progress toward proficiency.
Many teachers lack training in ways to accurately assess students in order to ensure they are making effective progress toward academic proficiency standards. As a result, high-stakes tests have often assumed the default job of identifying students who struggle academically, reflecting their progress toward standards, and measuring achievement in addition to fulfilling their intended role as accountability measures for schools and districts (Afflerbach, 2008). A
2010 survey finds that high-stakes assessment continues to be a “hot topic” even though most literacy leaders agree that it should not be (Cassidy, Valadez,
Garrett, & Barrera, 2010). Many critics, for example, posit a correlation between test scores and high school dropout rates (Nichols & Berliner, 2005; Shriberg &
Shriberg, 2006), and others find fault with the standards against which students are measured (Cronin, Dahlin, Adkins, & Kingsbury, October 2007).
The high-stakes test controversies, while interesting and informative in their own right, serve also to highlight several key issues in assessment that Module 3 introduces. While there is an important role for high-quality, standards-based achievement tests, Brenner, Pearson, and Rief (2007) articulate the contradictions and challenges they pose:
There’s a mismatch in what teachers know that state assessments should do (measure individual students’ growth over time) and what state assessments actually do (measure one group against another from one year to the next). There’s a mismatch in what state assessments value (the ability to select the single correct answer) and what twenty-first-century workplace skills demand (the ability to formulate multiple answers to complex problems). There’s a mismatch in what policy makers say about assessment (“makes schools accountable”) and who is punished when scores don’t meet a particular point (students who are retained or are refused diplomas). And there’s a mismatch between what assessment could actually do (celebrate accomplishments) and what high-stakes assessments do
(highlight failures)…The list of mismatches goes on, but always with the same result: frustration…What is the responsible measure of a student’s success and how do I marry that with NCLB, high-stakes tests, a standards-driven curriculum, and demands of the 21st-century workplace? (p. 259)
Quality assessment is a complex and dynamic process, not a particular test. A responsible and comprehensive assessment program must identify who needs what information, when, and in what form. The mismatches described above generally result when assessment is conceptualized as one-test-fits-every-purpose—a summative measure of achievement that can fulfill the data needs of all audiences. Instead, assessment must be balanced between valid and reliable measures of achievement of high standards (at the state and classroom levels), and embedded assessment activities that help teachers and students know exactly what steps must be taken to close the gaps between current and target knowledge and skills. Incidentally, the latter, known as formative assessment, is the focus of much current thinking in the field (Stiggins, April 2008; Torgesen & Miller, 2009; McManus, 2008).
Recent discussion about assessment has encouraged shifts in educators’ thinking. We must move away from the idea that assessment separates the successful from the unsuccessful, and toward the idea that assessments are part of instructional practices that move all students toward achieving mastery of standards (Stiggins, April 2008). Summative/achievement assessment evaluates learning after it has occurred. Too frequently, it is an endpoint for students and teachers, whether it is a state test or an algebra exam. Formative assessment, on the other hand, evaluates the effectiveness of teaching and learning while it is occurring, similar to the ways a coach observes and tailors guidance to the developing athlete, or an instructor watches and listens to a developing musician. As developing learners, adolescents, particularly those who struggle with learning, make startling gains in achievement when they and their teachers engage in formative assessment for learning (Chappuis & Stiggins, 2009; Black & Wiliam, 1998).
The distinction between assessment of learning and assessment for learning may seem arcane to many educators, but it is an important one—especially at a time when middle and high school students are falling alarmingly short of reaching literacy proficiency (Biancarosa & Snow, 2006; Graham & Perin, 2007).
Formative assessment for learning asks that both teachers and students have a clear vision of overarching goals and their learning targets, accurate data about each student’s current knowledge and skill level in relation to those targets, and that teachers have the professional knowledge and skill that empowers them to bridge the gaps through targeted instruction informed by a variety of assessment activities. Formative assessment is a process rather than a particular tool or test. Under its umbrella fall literacy screening, diagnostic assessment, progress monitoring, and curriculum-based measurement, as well as classroom assessment and activities, all of which are introduced in this Module (Stiggins, Arter, Chappuis, & Chappuis, 2009; Chappuis & Stiggins, 2009; Torgesen & Miller, 2009; Wylie, 2008; Fuchs & Fuchs, 2006; Johnson, Mellard, Fuchs, & McKnight, 2006; Black, Harrison, Lee, Marshall, & Wiliam, 2004; Deno, 2003; Black & Wiliam, 1998).
Finally, while assessment of literacy skills per se rightly occupies our attention, we must not overlook other areas of assessment that impact literacy development. Intelligences and learning styles (Gardner, 2006), thinking styles
(Sternberg, 1997), and motivation (Pitcher, Albright, DeLaney, Walker,
Seunarinesingh, Mogge, et al., 2007; Lavoie, 2007) all impact learning, sometimes in ways of which students are unaware, and that are invisible to teachers. Together, the related domains (cognitive, affective, and experiential)
“represent important and powerful aspects of student learning, and they must be addressed if we are to have any hope of meeting struggling adolescent readers’ needs” (Afflerbach, 2008, p. 254).
The purpose of this assessment Module is two-fold:
1) to encourage thinking about types and uses of assessment, and
2) to invite educators to use assessment to enhance instruction.
First, the Module encourages participants to expand their thinking about assessment and its uses, about students’ roles in the assessment process, and about the strengths and needs of their schools’ assessment programs. Second, the Module aims to enhance instructional practice by reviewing what makes a good assessment, highlighting many ideas for implementing formative assessment for learning, and inviting participants to engage in a wide array of interactive and hands-on activities that increase knowledge and skill.
Changes in educators’ instructional practices occur in phases. First, educators must be aware of recommended best practices and willing to think about how these might look in their work with students. Second, educators must be open to trying out new approaches, reflecting on their effects, and refining their use.
Third, educators must decide what to implement, re-crafting their existing practice to incorporate new practices into the classroom activity.
This Module is not designed to provide training in any particular assessment method (although Unit 3 covers basic literacy screening and progress monitoring for reading fluency). Rather, it aims to invite teachers of all levels to explore recent thinking about adolescent literacy assessment, and to engage with the facilitator and their colleagues as they try out and reflect upon approaches to formative assessment for learning.
Module 3 comprises three units. Unit 1, appropriate for all interested educators, introduces participants to broad foundational thinking about assessment and invites them to examine their own and their school’s assessment practices. Unit 2, also appropriate for all educators, focuses on formative assessment for learning in the general classroom and invites participants to consider many different types of formative assessment activities.
Unit 3, appropriate for all interested educators, addresses formative assessment of students at risk, and focuses on literacy screening, diagnostic assessment, and progress monitoring.
The warm-up/initial activities in each session are intended to act as informal screening activities that can highlight for the facilitator the participants’ general knowledge and skill level related to assessment. The activities and discussions included in each session are intended to be formative assessment activities in which the participants’ responses guide the facilitator’s instruction. The facilitator should create a comfortable classroom environment so that all participants feel safe asking questions and sharing what they try in class and how it works out. As for an “achievement” assessment of participants’ learning, the best approach would be a collegial observation in which teaching peers share with each other their goals and objectives for assessment, then observe each other and meet to discuss what went well and what next steps could be taken. Certainly, the facilitator could also create an instrument to assess assessment knowledge and administer it to participants as a pre- and post-test.
COMMON CORE CONNECTIONS
The College and Career Readiness Standards for English Language Arts expect students to attain a level of mastery that is seldom identifiable with one type of assessment. In fact, the Standards are written to define “what students should understand and be able to do by the end of each grade,” suggesting a developmental process that ideally results in students’ mastery of particular literacy skills. Additionally, each of the Common Core Standards is broken into a set of component skills that must work in tandem in order to demonstrate the overarching skill. This means that careful, continuous assessment of student progress (after establishing a baseline) at each step is a strong asset to both teachers’ and students’ efforts toward the goal. While attainment of the Common Core Standards is likely to be assessed with a standardized measure, either at the state or national level, the process of scaffolding students to such a level of attainment will require thoughtful, diverse, and particularized assessments, both formative and summative. This holds true for the Common Core Standards for Reading, Writing, Speaking and Listening, and Language in English Language Arts and every other content area.
REFERENCES
Afflerbach, P. (2008). Meaningful assessment of struggling adolescent readers.
In S. Lenski, & J. Lewis (Eds), Reading success for struggling adolescent
learners. New York, NY: Guilford Press, 249-264.
Biancarosa, G., & Snow, C. E. (2006). Reading next—A vision for action and research in middle and high school literacy: A report to the Carnegie Corporation of New York (2nd ed.). Washington, DC: Alliance for Excellent Education.
Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2004). Working inside the black box: Assessment for learning in the classroom. Phi Delta Kappan, 86(1), 9-21.
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7-74.
Brenner, D., Pearson, P. D., & Rief, L. (2007). Thinking through assessment. In K. Beers, R. E. Probst, & L. Rief (Eds.), Adolescent literacy: Turning promise into practice (pp. 257-272). Portsmouth, NH: Heinemann.
Cassidy, J., Valadez, C., Garrett, S., & Barrera, E. IV. (2010). Adolescent and Adult
Literacy: What's Hot, What's Not. Journal of Adolescent & Adult Literacy,
53(6), 448-456.
Chappuis, S., & Stiggins, R. (2009). Formative assessment and assessment for learning. In L. M. Pinkus (Ed.), Meaningful measurement: The role of assessments in improving high school education in the twenty-first
century. Washington, DC: Alliance for Excellent Education.
Cronin, J., Dahlin, M., Adkins, D., & Kingsbury, G. G. (October 2007). The
proficiency illusion. Thomas B. Fordham Institute and NWEA.
Deno, S. (2003). Developments in Curriculum-Based Measurement. Journal of
Special Education, 37(3), 184-192.
Fuchs, L. S., & Fuchs, D. (2006). Progress monitoring in the context of responsiveness-to-intervention. Portsmouth, NH: RMC Research
Corporation, Center on Instruction.
Gardner, H. (2006). Multiple intelligences: New horizons in theory and practice.
New York, NY: Basic Books.
Graham, S. & Perin, D. (2007). Writing next: Effective strategies to improve writing of adolescents in middle and high schools—A report to the
Carnegie Corporation of New York. Washington, DC: Alliance for
Excellent Education.
Johnson, E., Mellard, D. F., Fuchs, D., & McKnight, M. A. (2006). Responsiveness to intervention (RTI): How to do it. Lawrence, KS: National Research Center on Learning Disabilities.
Lavoie, R. (2007). The motivation breakthrough: 6 secrets to turning on the
tuned-out child. New York, NY: Touchstone.
McManus, S. (Coordinator). (2008). Attributes of effective formative assessment.
Washington, DC: Council of Chief State School Officers.
Nichols, S. L., & Berliner, D. C. (2005). The inevitable corruption of indicators and
educators through high-stakes testing. East Lansing, MI: The Great Lakes
Center for Education Research & Practice.
Pitcher, S., Albright, L., DeLaney, C., Walker, N., Seunarinesingh, K., Mogge, S., et al. (2007). Assessing adolescents' motivation to read. Journal of
Adolescent & Adult Literacy, 50(5), 378-396.
Shriberg, D., & Shriberg, A. B. (2006). High-stakes testing and dropout rates.
Dissent, 53(4), 76-80.
Sternberg, R. (1997). Thinking styles. New Haven, CT: Yale University Press.
Stiggins, R. (April 2008). Assessment manifesto: A call for the development of
balanced assessment systems. Portland, OR: ETS Assessment Training
Institute.
Stiggins, R., Arter, J. A., Chappuis, J., & Chappuis, S. (2009). Classroom assessment for student learning: Doing it right–using it well. Boston, MA: Allyn & Bacon.
Torgesen, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research Corporation, Center on Instruction.
Wylie, C. E. (2008). Formative assessment: Examples of practice. Washington,
DC: Council of Chief State School Officers.
UNIT 1: ASSESSING ASSESSMENT
The target audience for this unit is all middle and secondary educators/administrators (4th–12th grades) interested in academic assessment.
INTRODUCTION
Adolescent literacy assessment is a complex topic that includes many types of assessment, used for differing purposes, to meet the needs of many audiences.
This unit introduces participants to broad foundational thinking about assessment and encourages them to examine their own and their schools’ assessment practices through different lenses. The goal is to encourage thinking and reflection in order to enhance capacity to select and design learning activities that measure progress toward learning targets, and to use data to guide instructional decision making. Unit 1 comprises three sessions.

Note to Facilitators: When viewing or printing the slides associated with this Module, please choose “Notes” view, as many slides contain comments that are not included in the facilitator’s guide.
In Session 1, participants consider the vast array of assessment activities in schools, examine the relationship of existing assessment programs to the goal of ensuring that all students gain literacy proficiency, and consider their own knowledge/confidence levels about assessment. The questions raised in this session reinforce the idea that assessment is a dynamic process, not an end in itself. The development of effective assessment practices requires educators’ active and focused thinking about not only the standards to be achieved, but also how to identify, reach, and teach students who may be at risk for ineffective progress toward those standards.
Session 2 addresses literacy assessment specifically, and reviews the distinction between basic/intermediate and disciplinary literacy. Participants discuss the elements of a balanced assessment system (formative assessment for learning and summative assessment of learning), and have an opportunity to experience a literacy screening assessment. Activities in this session are focused on thinking like an assessor, considering classroom-based test validity, reliability, and fairness, identifying the principles of literacy assessment, and including students in the assessment process.
Session 3 focuses on the audiences/stakeholders in the assessment process, asking participants to consider who needs what data. This essential question guides not only the selection or design of assessment tools, but also the communication of data in a useful and meaningful way to students themselves, their parents, their teachers, and school and district leaders. Participants learn about the critical foundations of useful assessment and engage in a variety of activities aimed at enhancing their own classroom assessment practices, and thinking about their roles and responsibilities within the larger assessment program at their schools.
ELL INTRODUCTION
The number of English language learners (ELLs) in US public schools has grown exponentially compared to other student populations over the last two decades. Between 1991 and 2001, the ELL population in US public schools increased by 95%, while total enrollment increased by only 12% (Genesee, Lindholm-Leary, Saunders, & Christian, 2005). Other statistics suggest that there were an estimated 5 million students identified as ELLs in the 2005-2006 school year, 10.3% of the total K-12 student population (NCELA, 2008). While this population of students increases, national assessment data have consistently shown an achievement gap between English language learners and English monolinguals on standardized achievement measures of reading comprehension for American eighth-graders. Under NCLB legislation, ELLs have become a focus of testing achievement, and the persistent gap in achievement scores has increased the level of concern in research, policy, and school districts. Language minority status is one indicator that has been identified as partially responsible for stark differences in reading ability.
Briefly, this fast-growing student population struggles at the low end of the achievement gap. Only 19% of those students classified as ELLs met state norms for reading in English (Kindler, 2002). NAEP reading test scores indicate the percentage of ELLs scoring in the “proficient” and “advanced” categories is significantly less than English-only (EO) students (Lee et al., 2007, p. 59). Eighth-grade
2007, p. 67). More striking was the discrepancy between the percentage of each group scoring at or above the proficient level on the NAEP reading tests. Only
4% of ELLs in the nation scored at the proficient level in eighth grade, while 31% of EO students scored proficient (Lee et al., 2007, p. 67). In the Boston Public
Schools specifically, ELLs’ eighth-grade Math MCAS and tenth-grade ELA and
Math MCAS results are statistically significantly lower than their EO peers (Tung et al., 2009). It must be noted, however, that these disaggregated data are usually based on the common practice of identifying limited-English proficient
students¹ as representative of all ELLs in standardized assessments. In fact, the LEP label is temporary, as students move out of this status as they achieve English proficiency. Those students who leave English-language support are then aggregated into the mainstream population, so their (higher) test scores are no longer included in the disaggregated LEP population, making it difficult to get an accurate picture of ELL students’ achievement on national assessments (Francis et al., 2006). Despite the difficulties of classifying students, it is important that
ELLs are now included in state and national testing, both to keep a focus on the educational achievement of ELLs, and to make sure that state/district/school data are not distorting the percentages of students achieving proficiency (Francis et al., 2006).
The increased inclusion of ELLs in state assessments has led to better data on this population, particularly in light of standards-based reform initiatives across the US. Many states have implemented exit exams as graduation requirements for secondary students as part of these reforms (Darling-Hammond et al., 2005).
There have been concerns about reduced graduation rates resulting from the requirement of exit exams, particularly for struggling student populations like ELLs (Darling-Hammond et al., 2005). One report on ELLs in Arizona shows low pass rates on the state’s reading and math exit exams (Minnici et al., 2007), and these testing requirements for graduation may exacerbate the high school dropout rate of ELLs, due to validity and reliability issues of testing students in a language they are still acquiring. Nationwide disaggregated data are not common, but available data point to higher dropout rates for ELLs than non-ELLs (NCELA, 2008), and educators should be concerned about the potential negative influence that high-stakes assessments might have in this context.
One reason for the low scores of ELLs on standardized reading assessments in English may be reliability and validity issues. It is important for educators to know when their English language learners are proficient enough to participate in English-language testing, and yet this is still an unresolved issue in research (Hakuta & Beatty, 2000), despite federal and state policies requiring that language-minority students be tested sooner than in the past (García et al., 2006). Language proficiency measures do not always capture student knowledge and language use in real-world academic settings (García et al., 2006). Writing tests can pose reading challenges for students acquiring English, invalidating the assessment of their content knowledge (Butler & Stevens, in García et al., 2006). And standardized tests are usually normed on monolingual students in the US, creating test bias when administered to ELLs. Even when ELLs have access to tests in their native language, they are likely to be “parallel versions” of the English test, which may result in differences in vocabulary, word frequency, and psychometric properties (García et al., 2006).

¹ Limited English proficient (LEP) is a term that is not commonly used now, as it labels bilingual students as having a deficit, downplaying the language skills they do possess. Educators and researchers more commonly use the terms English language learner(s) (ELL), language-minority students (LM), or bilingual students.
Another concern is the possible cultural and linguistic biases of literacy tests and assessments. Since language learners usually have differences in receptive and productive language proficiencies, ELLs likely comprehend more than they can demonstrate in their second language (Lee, 1986). “They may have well-developed cognitive skills that underlie comprehension, such as integrating background knowledge with textual knowledge or drawing inferences across propositions, but they cannot apply these skills to text because their limited English proficiency interferes with their accessing enough of the text’s meaning to apply these skills” (García et al., 2006, p. 584).
Based on second language acquisition research, we know that language plays an important role in academic literacy. Furthermore, we know that assessments of content knowledge are also tests of language proficiency, because language and knowledge are “inextricable” (Francis et al., 2006, p. 11). Assessment validity is an open question when ELLs are assessed in their second language (L2): their scores may reflect L2 language abilities rather than their content and conceptual knowledge. Controlling for individual differences that are not the target of the assessment is commonly done through assessment accommodations for ELLs (Francis et al., 2006), with the understanding that effective accommodations should improve the performance of ELLs and leave the performance of EO students unchanged (Abedi et al., 2004; Francis et al., 2006). Assessment accommodations for ELLs, while currently used for many standardized tests, are not well researched. Reviews by Abedi et al. (2004) and Francis et al. (2006) cautiously suggest the following accommodations as promising, but still under-researched, for leveling the test-taking playing field for ELL and EO students. Abedi et al. (2004) found that modified English-language tests and dictionaries seem to reduce the performance gap, and Francis et al. (2006) found that English-language dictionaries and glossaries were the only ELL accommodation that produced statistically significant effect sizes in the studies they reviewed. These findings indicate that extra time may also be a factor, as using a dictionary during an assessment demands more time. Use of bilingual dictionaries did not show an effect, though this finding may be explained by the variability in ELL populations, L1 proficiency levels, and instruction. Educators cannot assume that ELLs will do better if they have access to native-language resources during assessments. Despite the lack of research in the area of assessment accommodations, it is important that whichever accommodations are used, they be the same as those used in regular classroom instruction, so that they effectively provide the language learner with support during the assessment.
ELL REFERENCES
Abedi, J., Hofstetter, C., & Lord, C. (2004). Assessment accommodations for
English language learners: Implications for policy-based empirical research. Review of Educational Research, 74(1), 1-28.
Francis, D., Rivera, M., Lesaux, N., Kieffer, M., & Rivera, H. (2006). Practical guidelines for the education of English language learners: Research-based recommendations for instruction and academic interventions.
Portsmouth, NH: RMC Research Corporation, Center on Instruction.
García, G. E., McKoon, G., & August, D. (2006). Synthesis: Language and literacy assessment. In D. August & T. Shanahan (Eds.), Developing literacy in second-language learners: Report of the National Literacy Panel on
language minority children and youth (pp. 583-596). Mahwah, NJ:
Lawrence Erlbaum Associates.
Genesee, F., Lindholm-Leary, K., Saunders, W., & Christian, D. (2005). English language learners in US schools: An overview of research. Journal of
Education for Students Placed at Risk, 10(4), 363-385.
Hakuta, K., & Beatty, A. (2000). Testing English-language learners in US schools.
Washington, DC: National Academies Press.
Kindler, A. L. (2002). Survey of the states’ limited English proficient students and available educational programs and services: 2000–2001 summary
report. Washington, DC: National Clearinghouse for English Language
Acquisition.
Lee, J. (1986). Background knowledge & L2 reading. Modern Language Journal,
70(4), 350-354.
Lee, J., Grigg, W., & Donahue, P. (2007). The nation's report card: Reading 2007 (NCES 2007-496). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, US Department of Education.
Minnici, A., Zabala, D., & Bartley, A. (2007). Caught in the middle: Arizona's English language learners and the high school exit exam. Washington, DC: Center on Education Policy.
NCELA. (2008). National Clearinghouse for English Language Acquisition and
Language Instruction Educational Programs. Retrieved February 22,
2009, from http://www.ncela.gwu.edu/expert/faq/08leps.html
Tung, R., Uriarte, M., Diez, V., Lavan, N., Agusti, N., Karp, F., et al. (2009). English learners in Boston public schools: Enrollment, engagement and
academic outcomes, AY2003-AY2006. Boston, MA: University of
Massachusetts.
ELL ADDITIONAL RESOURCES
For an overview of ELLs’ participation in high-stakes tests:
Coltrane, B. (2002). English language learners and high-stakes tests: An overview of the issues. Center for Applied Linguistics. EDO-FL-02-07. http://www.cal.org/resources/digest/0207coltrane.html
For a short overview of ELLs and federal testing requirements:
Cech, S. J. (2009). Testing tension: Weigh proficiency, assess content. Education Week, 28(17), 6-8. http://www.cal.org/qualitycounts/index.html
Unit 1: Session 1
GUIDING QUESTION/OBJECTIVE
What role does assessment play in decision-making related to improving adolescent literacy?
What is assessment?
Why should we assess?
What should we assess?
To understand that assessment tools are one part of an overall assessment process designed to ensure that all students gain proficiency
INTRODUCTION
Thoughtfully designed assessment, when administered and interpreted appropriately, can serve a variety of purposes for different audiences. This session asks participants to examine assessment in general from a pedagogical perspective, with the goal of recognizing what role assessment currently plays in our schools/classrooms, and in what ways we might expand our understanding and uses of assessment to enhance the success of our school as a whole, our own pedagogies, and our students’ academic progress.
Assessment tools are vehicles we use to gather data (e.g., observation, test, MCAS).

The assessment process refers to the decisions we make and actions we take as we prepare and administer tools, and interpret and communicate data.

Literacy assessment is an umbrella term that encompasses a variety of assessment tools and activities designed to gauge skills and knowledge related to receptive and expressive language (listening, speaking, reading, and writing).

Standards are reference points used for evaluation. The word is used commonly in education, but it can mean different things at different times, and can be measured using different types of assessments. State standards are reflected in the Massachusetts Frameworks (http://www.doe.mass.edu/frameworks/current.html), and more recently in the Common Core Standards (http://www.corestandards.org). Additionally, some schools and districts have specific standards and graduation requirements they measure with their own assessments. Finally, there may be departmental and class standards that are measured as well.

Formative assessment measures progress during learning. It is aimed at guiding instruction to ensure standards will ultimately be met.

Summative assessment measures achievement after learning. It is aimed at reporting out results of whether or not standards were met.

Criterion-referenced tests measure a student’s achievement in relation to a set standard.

Norm-referenced tests measure a student’s achievement in relation to other students’ performance on the same assessment.

Validity means that an assessment measures what it is supposed to measure.
Reliability means that the same or similar scores will be evidenced regardless of when the assessment occurs or who does the scoring.
ELL Connections:
For an overview of assessment issues and English language learners in US schools, view this webcast featuring Lorraine Valdez Pierce: Assessment of English language learners. Colorín Colorado. http://www.colorincolorado.org/webcasts/assessment
BEFORE THE SESSION
Read Carnegie Council on Advancing Adolescent Literacy. (2010). Time to act: An agenda for advancing adolescent literacy for college and career success. New York, NY: Carnegie Corporation of New York, pp. 1-5, “The Vision: Literacy for All.”
Read Tanner, J. (2009). College and work readiness as a goal of high schools: The role of standards, assessments, and accountability. In Pinkus, L. M. (Ed.), Meaningful measurement: The role of assessments in improving high school education in the twenty-first century. Washington, DC: Alliance for Excellent Education. http://www.all4ed.org/publication_material/reports/meaningfulmeasurement Chapter 1, pp. 9-23.
Review Pinkus, L. M. (Ed.) (2009). Meaningful measurement: The role of assessments in improving high school education in the twenty-first century. Washington, DC: Alliance for Excellent Education. http://www.all4ed.org/publication_material/reports/meaningfulmeasurement p. iii and pp. 1-8.
Review Carnegie Council on Advancing Adolescent Literacy. (2010). Time to act: An agenda for advancing adolescent literacy for college and career success. New York, NY: Carnegie Corporation of New York, pp. 7-15.
Consider looking at Cronin, J., Dahlin, M., Adkins, D., & Kingsbury, G. G. (October 2007). The proficiency illusion. Retrieved 4/1/10 from http://www.edexcellence.net/publicationsissues/publications/theproficiencyillusion.html pp. 2-7 and pp. 109-113.
Consider looking at Sternberg, R. J. (1997). Thinking styles. New York, NY:
Cambridge University Press.
Print Participant readings for Session 1 to distribute prior to the session.
Print Participant readings for Session 2 to distribute at the end of the session.
Procure Materials needed for Session 1 (See Materials below).
Participants
Read Carnegie Council on Advancing Adolescent Literacy. (2010). Time to act: An agenda for advancing adolescent literacy for college and career success. New York, NY: Carnegie Corporation of New York. Retrieved from http://carnegie.org/fileadmin/Media/Publications/PDF/tta_Main.pdf pp. 1-5, “The Vision: Literacy for All.”
Read Tanner, J. (2009). College and work readiness as a goal of high schools: The role of standards, assessments, and accountability. In Pinkus, L.M., ed. (2009).
Meaningful measurement: The role of assessments in improving high school education in the twenty-first century. Washington, DC: Alliance for Excellent
Education. http://www.all4ed.org/publication_material/reports/meaningfulmeasurement
Chapter 1, pp. 9-23.
Print Sternberg-Wagner Thinking Styles Inventory from http://www.ldrc.ca/projects/tscale/tsint.php
and bring with you to the session.
Print What Are My Learning Strengths? from http://www.ldrc.ca/projects/miinventory/mitest.html
Materials
Small sticky notes for participants
Wall sized sticky papers (Post-It recommended)
DURING THE SESSION
Participants will engage in a Think, Talk, Write protocol about associations with the term “assessment,” and then complete the survey. The purpose of this activity is to encourage participants to consider all the kinds of assessment we use. The Talk part of this activity captures what participants have in mind when we use the word “assessment.” The Write part of this activity, a checklist of assessments that is included in the Participant’s Resource Packet, raises awareness of the many types of assessment students experience in school. It encourages participants to begin expanding their thinking about assessment beyond tests, quizzes, projects, etc.
Using Slide 4 and the following slide, The Assessment Process, distinguish between assessment tools (e.g., observations, student conferences, student assignments, tests, essays, projects, standardized interim assessments, MCAS,
PSAT, etc.) and the assessment process. The goal of this section is to encourage participants to think about assessment as a process rather than a product used to determine a grade or percentile rank.
Note that assessment is an umbrella term that covers literacy assessment, the focus of this Module. Literacy assessment is, itself, an umbrella term within the larger assessment umbrella. It encompasses a variety of assessment tools and activities designed to gauge skills and knowledge related to receptive and expressive language (listening, speaking, reading, and writing).
1. As a group, discuss the purposes for assessment in each category: What type of information do we want from it? What do we plan to do with the information?

2. Ask participants to cite examples of each type of assessment listed on the slide. Participants might refer to the survey they completed for ideas. Here are some examples:

• Knowledge/skill level prior to instruction: surveys, pretests, screenings, brainstorms
• Achievement at the end of an instructional period: tests, essays, projects, presentations, quizzes
• Preferences for how to learn and demonstrate learning: inventories for learning and thinking styles, interviews/reflections with students, observations and note taking on student performance during different tasks and in different settings
• Progress during instruction: curriculum-based measurement, observations, Q & A, assignments that do not count toward a grade, practice tests

In order to categorize the examples visually, consider putting up one wall-sized Post-it for each category listed here, and have participants write examples on each.

3. Discuss which types of assessment activities are used most frequently, and talk about the pros and cons of each category with respect to how they inform classroom instruction.
The focus of the next several slides is to broaden participants’ thinking about schools’ responsibilities to students (as reflected in NCLB) and the different role that assessment plays in the new way of thinking. The goal is for participants to consider assessment in broader terms than achievement assessments
(summative assessments) in order to include assessments that enhance teaching and learning while it is happening (formative assessment). The slides about effectiveness, efficiency, and equity all speak to this broader thinking about formative assessment, which will be addressed in further detail in Session 2.
These slides aim to establish a shared assessment vocabulary. These six terms provide some foundation for discussing assessment from differing points of view.
After reading “College and Work Readiness as a Goal of High Schools: The Role of Standards, Assessments, and Accountability,” have participants use the Text Rendering Experience (http://www.nsrfharmony.org/protocol/learning_texts.html) as a structure to discuss and reflect. Consider following up at the end of the activity with the following questions:

1. Who is the audience for this article?

2. What types of assessment does the article address?

3. How can the ideas in this article inform the decisions we make about how to use assessment in the classroom?
These last slides provide a framework for thinking about participants’ role as assessors in educating students. These are questions that may be revisited in different ways throughout the Module, and they ask implicitly that participants bear in mind the idea that assessment is a dynamic process, not an end in itself, and that it requires educators’ active and focused thinking about not only the standards that are to be achieved, but also the best ways to identify, reach, and teach students who may be at risk for insufficient progress.
This activity asks participants to examine the “ideal high school” as presented in the reading from Time to Act, “The Vision: Literacy for All.”
1. As a group, identify examples of assessments used at the ideal school.

2. Categorize these examples on the Riverside High School Handout.

3. Discuss:
a. How does the assessment focus in this reading differ from the focus in the other reading?
b. In what ways does this reading encourage us to expand our views of assessment and its purposes?

These directions appear on the handout:

1. Review the section on Riverside High School in the reading “The Vision: Literacy for All.”

2. As a group, identify examples of assessments used at the ideal school and write these in the categories below. Discuss:
a. How does the assessment focus in this reading differ from the focus in the other reading?
b. In what ways does this reading encourage us to expand our views of assessment and its purposes?
Note: The categories along the top row ask participants to consider assessment practices (in the broadest terms) through the perspective of the five questions presented in Session 1. The categories along the first column ask participants to consider assessments targeted at eliciting data at the programmatic level (schools), at the professional development level (teachers), and at the instructional level (teachers and students).

Note to Facilitators: This activity is optional, but we recommend it. The two pieces described here are available for use by individuals only; therefore, they cannot be part of a photocopied handout. In order to do this activity, participants will have had to access and print both pieces from the internet prior to this session.
Each of us has preferences for how we take in and process information. Our preferences influence how we teach as well. This is a sample of an assessment that addresses Question 3: “How do our students learn best?”
For participants, it is useful to consider how individual preferences influence their own teaching styles. These types of assessments are also very important for students to take in order to gain a clearer understanding of their own strengths and preferences related to learning and performance. Please tell participants that there is an online inventory for thinking styles available at http://www.ldrc.ca/projects/tscale/
Participants should have with them the overview of Thinking Styles (Sternberg, 1997) and the learning styles inventory (Gardner, 1983).
Ask participants to read over the summaries of thinking styles and take the learning styles self-assessment.
Afterward, participants should pair up with a partner and discuss how data from this type of assessment could enhance their teaching.
Bring the whole group together at the end to share any insights gained from the activity.
AFTER THE SESSION (FOR NEXT TIME…)
Bring 2-3 examples of assessments you use in your classes.
Choose one or more activities:
Review the five questions and reflect on which questions occupy most of your focus. For a week or so, experiment with shifting your thinking to the other questions, and come to the next session prepared to share whether and how this shift affected the teaching and learning in your classroom.
Consider surveying your students on their thinking and learning styles and asking them to reflect on their learning experiences through this lens.
Consider taking the Sternberg Thinking Styles online learning styles assessment at http://www.ldrc.ca/projects/tscale/ . This is a comprehensive inventory (104 questions) that requires you to provide an email address to receive your results. Jot down some notes about how your thinking style influences your teaching style, and come to the next session prepared to share insights on teaching and learning through this lens.
REFERENCES
Afflerbach, P. (2008). Meaningful assessment of struggling adolescent readers. In S. Lenski & J. Lewis (Eds.), Reading success for struggling adolescent learners (pp. 249-264). New York, NY: Guilford Press.
Carnegie Council on Advancing Adolescent Literacy. (2010). Time to act: An agenda for advancing adolescent literacy for college and career success.
New York, NY: Carnegie Corporation of New York.
Deno, S. (2003). Developments in curriculum-based measurement. Journal of Special Education, 37(3), 184-192.
Gardner, H. (1983). Frames of mind. New York, NY: Basic Books.
Pinkus, L. M. (Ed.) (2009). Meaningful measurement: The role of assessments in improving high school education in the twenty-first century. Washington, DC: Alliance for Excellent Education. http://www.all4ed.org/publication_material/reports/meaningfulmeasurement
Adolescent Literacy Facilitator’s Guide
Sternberg, R. (1997). Thinking styles. New York, NY: Cambridge University Press.
Stiggins, R. (April 2008). Assessment manifesto: A call for the development of
balanced assessment systems. Portland, OR: ETS Assessment Training
Institute. http://www.doe.mass.edu/NCLB/ .
ADDITIONAL RESOURCES
Assessment and Evaluation Links http://www.adlit.org/researchbytopic/c112
Practical Assessment, Research and Evaluation. An online free-access journal http://pareonline.net/
Unit 1: Session 2
GUIDING QUESTION/OBJECTIVE
What role does assessment play in decision making related to improving adolescent literacy?
In what ways can we expand our understanding of the purposes and uses of assessment beyond quizzes and tests?
What is literacy assessment?
What are the elements of a balanced assessment model?
How can I begin to enhance my use of assessment in the classroom?
To understand the elements of a balanced assessment model and examine teachers’ own assessment practices from this point of view
To consider what can be done to enhance their uses of assessment in the classroom
INTRODUCTION
There are many types of assessments for literacy, and sometimes the terminology used to describe them can be confusing. Successful tiered instruction depends upon understanding and appropriately using a variety of assessment types. No single assessment can provide a reliable measure of student performance, and no single assessment can gauge all areas of literacy. A comprehensive assessment process assumes the use of a variety of assessment types and tools used to measure both basic and disciplinary literacy. Screening assessments are administered prior to instruction. Formative assessments (and diagnostic assessments) are administered during instruction for the purposes of guiding instruction. Summative (achievement) assessments are administered after an instructional period.
All educators can develop their skills to think as assessors as well as course designers. Consciously asking questions of themselves before, during, and after instruction can help guide assessment practices. In addition, part of assessment design should be planning to include students in the process.
ELL Connections:
Formative assessment is important to inform teachers of their ELLs’ language acquisition progress. It is well established in the second language acquisition literature that language learners, no matter their first-language background, build their grammatical systems in predictable sequences (Corder, 1967; Krashen, 1977). Without attention to these stages through regular assessment, these students can be at risk of being misclassified as language impaired or learning disabled. And without assessment of students’ knowledge prior to instruction, educators may risk misunderstanding the background knowledge their ELLs bring from their varied experiences.
BEFORE THE SESSION
Read Cooper, J. D. (1997). Literacy: Helping children construct meaning, 3rd edition. Boston: Houghton Mifflin Company. Pages 516-518, available through the North Central Regional Educational Laboratory at http://www.ncrel.org/sdrs/areas/issues/content/cntareas/reading/li7lk5.htm
Read Principles of Literacy Assessment
Read Massachusetts Department of Elementary and Secondary Education.
(2010). Guidelines for developing an effective district literacy action plan,
version 1.0. Malden, MA: Massachusetts Department of Elementary and
Secondary Education. Read pp. 2-7 (What IS a Strategic District Literacy
Action Plan?) and pp. 23-24 (Systemic Data Use)
Read Laboratory Network Project. A tool kit for professional developers:
Alternative assessment. Article retrieved 4/9/10 from http://www.ncrel.org/sdrs/areas/issues/methods/assment/as5relia.htm
Read “Reliability, Validity, and Fairness of Classroom Assessments”
Read Chappuis, S., Chappuis, J., & Stiggins, R. (2009). The Quest for
Quality. Educational Leadership, 67(3), 14-19. http://www.ascd.org/publications/educational_leadership/nov09/vol67/n um03/The_Quest_for_Quality.aspx
Consider Reading Torgesen, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research Corporation, Center on Instruction. www.centeroninstruction.org/files/Assessment%20Guide.pdf
Read pp. 57-64.
Consider Reading Carnegie Council on Advancing Adolescent Literacy.
(2010). Time to act: An agenda for advancing adolescent literacy for college and career success . New York, NY: Carnegie Corporation of New
York.
Read pp. 17-33 “The Keys to Successful Reform.”
Consider Reading Stiggins, R. (April 2008). A call for the development of balanced assessment systems. Portland, OR: ETS Assessment Training
Institute. Retrieved 3/2/10 from www.nmsa.org/portals/0/pdf/.../other.../AssessmentManifesto08.pdf
Read All. This is also a required reading document for Unit 2.
Consider Reading Edutopia (n.d.). Grant Wiggins: Defining assessment. Retrieved 4/8/10 from http://www.edutopia.org/grant-wiggins-assessment
Read Interview with Grant Wiggins
Print Participant readings for Session 3 to hand out at the end of the session.
Print Participant’s Resource Packet Unit 1, Session 2 to hand out at the session.
Print Sample Screening Maze Passage Handout (Included in the
Facilitator’s Guide)
Procure Materials for Session 2 (see Materials below).
Participants
Read Cooper, J. D. (1997). Literacy: Helping children construct meaning, 3rd edition. Boston: Houghton Mifflin Company. Pages 516-518, available through the North Central Regional Educational Laboratory at http://www.ncrel.org/sdrs/areas/issues/content/cntareas/reading/li7lk5.htm
Read Principles of Literacy Assessment
Read Massachusetts Department of Elementary and Secondary Education. (2010). Guidelines for developing an effective district literacy action plan, version 1.0. Malden, MA: Massachusetts Department of Elementary and Secondary Education. Read pp. 2-7 (What IS a Strategic District Literacy Action Plan?) and pp. 23-24 (Systemic Data Use). *These pages are in the Participant’s Resource Packet
Read Laboratory Network Project. A tool kit for professional developers:
Alternative assessment. Article retrieved 4/9/10 from http://www.ncrel.org/sdrs/areas/issues/methods/assment/as5relia.htm
Read “Reliability, Validity, and Fairness of Classroom Assessments”
Read Chappuis, S., Chappuis, J., & Stiggins, R. (2009). The Quest for Quality.
Educational Leadership, 67(3), 14-19. http://www.ascd.org/publications/educational_leadership/nov09/vol67/num03
/The_Quest_for_Quality.aspx
Recommended Reading Stiggins, R. (April 2008). A call for the development of balanced assessment systems. Portland, OR: ETS Assessment Training Institute.
Retrieved 3/2/10 from www.nmsa.org/portals/0/pdf/.../other.../AssessmentManifesto08.pdf
Read All. This is also a required reading document for Unit 2.
Materials

Index cards for questions and activities, at least 12 per participant
DURING THE SESSION
Take time to ask participants to share what activity/activities they chose to do and what their experience was.
The three questions may be put onto index cards and turned in to the facilitator to be addressed later in the session.

1. Ask participants to reflect on the question, “What are two or three aspects of assessment that you’ve been thinking about since last session?” Have them turn to a partner to share their thoughts.

2. Ask them to write down on index cards three questions they have about assessment at this moment.

3. Finally, take a few minutes as a group and invite people to share any insights they gained from the follow-up activities from the last session.
The opening slides, starting with slide 4, provide an overview of the differences between basic literacy skills and disciplinary literacy skills. Participants who engaged in study of Module 2 will be familiar with these terms, but if they do not possess this background, it is important to distinguish between the two so that all participants understand that teaching literacy skills means a commitment to overall literacy development; not every teacher needs to teach everything (it must be a team approach).
The purpose of most academic assessment is to ensure that curriculum and instruction are bridging the gaps between “what the student knows/does now” and “what the student needs to know/do.” It is important to note, however, that although students who struggle with basic literacy skills must receive remediation and be monitored through ongoing assessment at that level, they can and should still be instructed and assessed on disciplinary literacy skills (using texts, etc. that are at their decoding and comprehension level). For example, a student who is not fluent in arithmetic (basic calculation/math facts) should work on those, but ALSO be challenged to develop his or her mathematical thinking. Accommodations such as calculators and extended time can allow students with arithmetic challenges to progress at higher levels of thinking.
Adolescent Literacy Facilitator's Guide

1. Participants have reviewed pp. 2-7 and pp. 23-24 of the DESE Guidelines for Developing an Effective District Literacy Action Plan, version 1.0. Some districts have a plan, others are in process, and others have not begun. Ask participants to get into groups and discuss what they know about their district action plan.
2. Distribute the Assessment Knowledge/Confidence Survey (in the Participant's Resource Packet) and ask participants to take it. Surveys may be collected in order to assess participants' current levels of knowledge and skill, and to provide suggestions for further reading and study.
The slides in this portion of the session highlight for participants the basic elements of a balanced assessment program: screening, progress monitoring, achievement, and diagnostic.
It is important to clarify for participants that there are formal and informal versions of each of these categories of assessment. These are briefly reviewed below:
Screening
Formal screening is done schoolwide for the purposes of recognizing which students may be in need of interventions to improve their skills. Students identified through this process are placed in tiered instruction, and their progress is carefully monitored using a formal system such as that described in
Unit 3 of this Module.
Informal screening may also be done by classroom teachers for the purpose of gathering data about students’ current levels of knowledge and skill. A classroom teacher may ask students to write for 10 minutes in response to a prompt, for example, in order to assess time-constrained writing production levels, organization of written output, applications of writing conventions, etc. A classroom teacher might also administer an informal reading screening (such as the example MAZE passage that participants will take in this session) for the purpose of assessing their students’ reading fluency. A classroom teacher may also administer informal screenings for vocabulary knowledge, or content knowledge as well. Many classroom teachers do this under the name of “pretesting” and find it helpful to demonstrate to students how their skills have developed and their knowledge increased as a result of their work in class.
The purpose of screening at either level is to inform instruction. Formal screening points toward needed changes or additions to a student’s program.
Informal screening at the classroom level helps guide teachers to differentiating length of assignments, types of text, how much time to spend on particular content or skills, etc.
Module 3: Assessment
Unit 1: Session 2
Progress Monitoring
Like screening, progress monitoring may be formal, such as the monitoring of students' reading skills to ensure that they are responding to interventions
(making effective progress) in tiered instruction. This type of progress monitoring is described in detail in Unit 3 of this Module.
Progress monitoring may also be done at the classroom level. Classroom teachers have a multitude of options for measuring their students’ progress in class, from the structured model of curriculum-based measurement (See Deno,
2003) to portfolio assessment. Frequent progress monitoring is an essential component of effective teaching.
Achievement
Achievement assessments (also called "summative assessments") measure learning AFTER it has occurred, and may be formal or informal. Examples of formal achievement assessments include MCAS, NWEA, Stanford Achievement
Tests, PSATs and SATs, AP examinations, etc. Informal achievement assessments in the classroom are those assessments used for the purpose of grading or ranking students—examples include quizzes, tests, essays, projects, presentations, etc.
Diagnostic Assessment
Diagnostic Assessments can also be formal or informal. Formal diagnostic assessment includes a myriad of tests administered as part of an educational evaluation, a neuropsychological evaluation, a speech-language evaluation, etc., as well as formal tests administered by specialists within the school to gather data about particular areas of a student’s difficulty in order to make better decisions about instructional programs or placement, and in order to inform the team involved in crafting an IEP.
When classroom teachers engage with individual students to figure out particular areas of strengths and difficulty, they are engaging in a form of diagnostic assessment. A teacher might observe a student, gather work samples, and meet with him or her about math performance, for example.
Using work samples and observations, the teacher and student may figure out that the student understands the concepts and can complete the calculations, but performs poorly on the achievement tests due to lack of time, or as a result of anxiety. Or, in another example, the teacher and student may determine that the student understands the mathematical concepts involved, but due to weak arithmetic skills, consistently gets incorrect answers.
Ask participants to take a sample Maze Passage screening. The passage and instructions are included in the Facilitator's Guide. Tell participants that these screenings are examples of curriculum-based measurement, constructed from a reading related to this Module following published guidelines (see References).
Maze passages can be used in literacy interventions to establish baseline scores and monitor progress. These can also be created by the classroom teacher for informal assessment of reading comprehension with the purpose of guiding classroom instruction (e.g., shortening reading assignments, providing preview materials, re-teaching main ideas, etc.)
1. Explain the instructions, then pass out the STUDENT COPY of the passage. Participants should keep the sheet face-down until told to begin.
2. Tell them to start, and run a timer for 3 minutes; then ask them to stop and turn over their sheets.
3. Pass out the answer key and ask participants to score themselves and find out where they fall on the Florida proficiency chart that is included in their Participant's Resource Packet.
4. Share with participants that maze passages are often part of schoolwide screening, and student scores may be compared to the risk-level charts published by the Florida Center for Reading Research (see the Maze Passage Risk Level Charts at http://www.fcrr.org/assessmentMiddleHighSchool.shtm). In a "real" screening, participants would take two (2) passage assessments, and their scores would be averaged. Maze passages can also be useful to content area teachers interested in screening and monitoring the comprehension progress of the students in their courses in order to target their instruction to student strengths and needs.
5. After they have completed the activity, the group can discuss how they might use measures like these to establish baseline performances and to gain data about their students' current skill and knowledge levels in their classes. Curriculum-based measurement is addressed in further detail in Unit 3 of this Module.
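For facilitators who want to show how the scoring works, the arithmetic described above (two timed passages, scores averaged, average compared against risk-level cut points) can be sketched in a few lines. Note that the cut points below are hypothetical placeholders for illustration only; the real values come from the FCRR Maze Passage Risk Level Charts and vary by grade level.

```python
# Sketch of maze screening scoring. The cutoffs are HYPOTHETICAL
# placeholders; actual grade-specific values are in the FCRR charts.

def maze_risk(correct_counts, low_cutoff=15, high_cutoff=8):
    """Average the correct-response counts from two 3-minute maze
    passages and map the average onto a risk band."""
    if len(correct_counts) != 2:
        raise ValueError("A maze screening uses two passages.")
    avg = sum(correct_counts) / 2
    if avg >= low_cutoff:
        band = "low risk (at grade level)"
    elif avg >= high_cutoff:
        band = "medium risk"
    else:
        band = "high risk"
    return avg, band

print(maze_risk([18, 14]))  # (16.0, 'low risk (at grade level)')
```

With the placeholder cutoffs, a student averaging 16 correct responses would fall in the low-risk band; substituting the published grade-level cut points makes the same logic match the FCRR charts.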
1. This activity is based on the reading "The Quest for Quality."
2. Divide participants into 5 groups, each of which will focus on one "Key to Quality." Ask each group to select a "reporter" who will take notes and summarize the group's discussion to the whole class at the end of the activity.
3. Each group will discuss the following: In relation to your group's "key," what do you see as the strengths and needs of the assessment activities in your classes/schools/districts?
4. Bring the whole group back together, and ask the reporters to share the highlights of their group's discussion.
5. Take a few moments to ask for further comments on the article.
These and subsequent slides ask participants to expand their thinking to include a variety of well-designed assessments as part of their curricular/instructional design, and to include students in that process. Ask participants to look over the
Thinking Like An Assessor handout, and comment on:
In what ways are these types of thinking different?
What makes each type of thinking important to good teaching?
1. Ask participants to brainstorm the types of questions they ask themselves:
   a. Before they teach
   b. While they are teaching
   c. After teaching
2. They should write each question on an index card.
3. Ask them to get into small groups (3 or 4) and sort the questions into those that lead to formative assessments and those that lead to summative assessments.
4. If there is enough time, ask participants to generate ideas about what forms of assessment they might use that would provide data to answer each question. This could also be assigned as an activity after the session.
Some sample questions are listed below. This activity can spur a very useful discussion.
How long will it take my students to do this assignment?
What is the best way to present the material?
Did the student meet the standard I set?
How does this student’s performance compare to others?
How much background do my students have in this topic?
Do I need to re-teach or review the skill?
Why don’t some students participate in class?
1. Ask participants to recall the main points from the readings "Reliability, Validity, and Fairness of Classroom Assessments" and "Principles of Effective Literacy Assessment."
   a. The first reading highlighted three important elements to keep in mind when designing assessments: validity, reliability, and fairness. Ask for three participants to volunteer to recall to the group what each of these is, with an example.
   b. The second reading highlighted eight principles of effective assessment. Ask for eight volunteers to take turns recalling to the group one of the principles and an example of it.
This topic will be addressed in much further detail in Unit 2, but is important to emphasize here as an essential element of good assessment practice.
Ask participants to take out the examples of assessments they use in their work with students. Ask them to pair up and share them with their partner, taking turns being the “interviewer.” The interviewer asks the teacher the following questions about one of the assessments, and notes their answers on the
Interview Questions handout:
1. What is the purpose of this assessment?
2. How are students included in this assessment process?
3. How do you use the data gained from this assessment?
4. Is the assessment valid? Is it reliable? How do you know?
5. How might you enhance this assessment?
AFTER THE SESSION (FOR NEXT TIME…)
Engage in an activity that includes students in the assessment process in some way.
See suggestions for activities in the Participant’s Resource Packet:
Standards discussion
Discussion of models
Test Analysis
Prior to the next test or assignment (e.g., notes, a short essay) for your students, take some class time to talk with them about the standards/learning targets that the assessment is assessing, and provide them with a model of a proficient assignment and a poor one (from a previous year’s class and with the name deleted). Ask students to analyze the differences between the two as you guide the discussion and add your own comments. Save these notes (students may want to take notes as well) and use them as the basis for creating a test/assignment “reflection” for students to complete after the test or assignment. This may provide the basis for beginning to include students in the creation of criteria/rubrics for classroom assignments, one step toward including students in the assessment process.
REFERENCES
Carnegie Council on Advancing Adolescent Literacy. (2010). Time to act: An agenda for advancing adolescent literacy for college and career success. New York, NY: Carnegie Corporation of New York.
Carnegie Mellon University. Assessment basics. http://www.cmu.edu/teaching/assessment/howto/basics/objectives.html
Chappuis, S., Chappuis, J., & Stiggins, R. (2009). The quest for quality. Educational Leadership, 67(3), 14-19. http://www.ascd.org/publications/educational_leadership/nov09/vol67/num03/The_Quest_for_Quality.aspx
Massachusetts Department of Elementary and Secondary Education. (2010).
Guidelines for developing an effective district literacy action plan,
version 1.0. Malden, MA: Massachusetts Department of Elementary and
Secondary Education.
Maze passage text from An Introduction to Curriculum Based Measurement/Curriculum Based Assessment, http://www.specialconnections.ku.edu/cgi-bin/cgiwrap/specconn/main.php?cat=assessment&section=cbm/main
McManus, S. (Coordinator). (2008). Attributes of effective formative assessment.
Washington, DC: Council of Chief State School Officers.
Newhall, P. (2008). Study skills: Research-based teaching strategies. Prides
Crossing, MA: Landmark School, Inc., pp. 70-71.
Thinking like an assessor. www.academicesl.com/docs/09THINKING_LIKE_AN_ASSESSOR.pdf
(apparently adapted from Wiggins & McTighe (2005), but there is no source information on this pdf, nor is it linked to a website).
ADDITIONAL RESOURCES
Carnegie Mellon University Assessment Basics http://www.cmu.edu/teaching/assessment/howto/basics/formativesummative.html
Carnegie Mellon University The Whys and Hows of Assessment http://www.cmu.edu/teaching/assessment/howto/basics/formativesummative.html
Wiggins, G. and McTighe, J. (2005). Understanding by design, expanded 2nd edition. ASCD. Chapter summaries available at http://www.ascd.org/publications/books/103055.aspx
ELL REFERENCES
Corder, S. (1967). The significance of learner's errors. International Review of
Applied Linguistics in Language Teaching, 5, 161-169.
Krashen, S. (1977). Some issues relating to the monitor model. On Tesol, 77,
144-158.
Recommended resources for instructor and participants:
Valdez Pierce, L. (2002). Performance-based assessment: Promoting achievement for English language learners. ERIC News Bulletin, 26(1). http://www.cal.org/resources/archive/news/2002fall/CLLNewsBulletin_Fa02c.pdf
ELL ADDITIONAL RESOURCES
Gomez, E. (2000). Assessment portfolios: Including English language learners in large-scale assessments. Center for Applied Linguistics. EDO-FL-00-10. http://www.cal.org/resources/digest/0010assessment.html
1. See instructions under Session 2 Activity: Sample Screening Assessment.
2. Explain that the person being screened will receive a reading passage in which every 7th word has been deleted and replaced by a choice of the correct word and two distracter words. He or she will be given 3 minutes to read the passage and circle or underline the word that best fits the sense of the sentence. Scores are based on the number of correct answers provided in the given time.
3. The Maze Risk Level Chart below (and in the Participant's Resource Packet) is used by the Florida Center for Reading Research to identify students at risk for poor performance. The number of correct responses indicates whether the student is high risk, medium risk, or low risk (at grade level). This and other resources are available at http://www.fcrr.org/assessmentMiddleHighSchool.shtm
4. NOTE: As you discuss the risk factor charts, be cognizant that not all participants may be fluent readers themselves.
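The passage format in step 2 can be illustrated with a short sketch: every 7th word is replaced by a choice set containing the original word and two distractors. The distractor selection here is deliberately naive (random words drawn from elsewhere in the passage), whereas published maze-construction guidelines apply more careful distractor rules, so treat this purely as an illustration of the format.

```python
# Illustration of the maze format: every nth (default 7th) word is
# replaced by "(choice / choice / choice)", one choice being the
# original word. Distractors here are naively sampled from the
# passage itself -- real maze construction uses stricter rules.
import random

def make_maze(text, n=7, seed=0):
    rng = random.Random(seed)
    words = text.split()
    out = []
    for i, word in enumerate(words, start=1):
        if i % n == 0 and len(words) >= 3:
            # Two distractors drawn from the rest of the passage.
            distractors = rng.sample([w for w in words if w != word], 2)
            choices = [word] + distractors
            rng.shuffle(choices)
            out.append("(" + " / ".join(choices) + ")")
        else:
            out.append(word)
    return " ".join(out)

passage = ("The students opened their books and began to read the first "
           "chapter about the history of their town and its people.")
print(make_maze(passage))
```

Running this on the 21-word sample passage produces three choice sets (one per seven words), matching the deletion rate described in step 2.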
[Maze Risk Level Chart image: Florida Center for Reading Research, July 2006, www.fcrr.org]
Unit 1: Session 3

GUIDING QUESTION/OBJECTIVE
What role does assessment play in decision making related to improving adolescent literacy?
In what ways can we expand our understanding of the purposes and uses of assessment beyond tests and quizzes?
Assessment stakeholders: Who needs what data?
What are the critical foundations of useful assessment?
What are the roles and responsibilities related to implementing assessment systems?
To understand that different types of assessment activities provide data for different purposes and different audiences
To practice articulating a goal and breaking it down into learning targets
To become aware of who is responsible for what assessment activities in participants’ school or district
INTRODUCTION
There are many important elements of planning for and implementing a useful and balanced assessment program within the classroom and at the school and district levels. All stakeholders begin with the same goal of ensuring that all students reach language proficiency, and that already language-proficient students are challenged to meet higher levels of expectation. In spite of this shared goal, each stakeholder needs something different from assessment; a comprehensive assessment program will ensure that each stakeholder has timely access to the meaningful data that will guide next steps in program design, teaching, and learning. An important step in the process of gathering and communicating meaningful data is formulating a clear set of proficiency standards, and then breaking these standards into learning goals and learning targets that lead toward the goals.
In addition to this challenge, a comprehensive program will also define clear roles and responsibilities for the design, selection, administration, scoring, interpretation, communication, and use of the assessment data.
This session introduces these challenges and asks participants to consider (or
“assess”) what teachers and schools can do to develop better capacity for a comprehensive and balanced assessment program.
BEFORE THE SESSION
Read National Association of Secondary School Principals. (2009). Putting assessment in the driver’s seat. Extracted with permission from http://www.carnegie.org/literacy/pdf/Culture_of_Literacy.pdf
and retrieved 3/10/10 from http://www.adlit.org/article/31350
Read Morsy, L., Kieffer, M., and Snow, C.E. (2010). Measure for measure: A critical consumers' guide to reading comprehension assessments for adolescents. New York, NY: Carnegie Corporation of New York. Excerpt retrieved 3/10/10 from http://www.adlit.org/article/34652. Read "What Should an Assessment System Look Like?"
Review Massachusetts Department of Elementary and Secondary
Education. (2010). Guidelines for developing an effective district literacy
action plan, version 1.0. Malden, MA: Massachusetts Department of
Elementary and Secondary Education. Examine p. 94, “Massachusetts
Secondary Literacy Framework”
Also Consider Reading:
Ronka, D., Lachat, M. A., Slaughter, R., & Meltzer, J. (December
2008/January 2009). Answering the questions that count. Retrieved
3/31/10 from www.ascd.org/.../Answering_the_Questions_That_Count.aspx
Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., &
Wayman, J. (2009). Using student achievement data to guide instructional decision making (NCEE 2009-4067). Washington, DC:
National Center for Education Evaluation and Regional Assistance,
Institute of Education Sciences, U.S. Department of Education.
Retrieved from http://ies.ed.gov/ncee/wwc/publications/practiceguides/ . Especially pp. 9-26 which address assessment at the classroom level.
Levin, H. M.; Catlin, D.; Elson, A. (2009). Adolescent literacy programs:
Costs of implementation. New York: Carnegie Corporation. Retrieved
4/12/10 from carnegie.org/fileadmin/Media/Publications/PDF/tta_Levin.pdf
National Association of Secondary School Principals. (2005). Creating a
culture of literacy: A guide for middle and high school principals.
Retrieved 4/11/10 from http://www.adlit.org/article/23373
Silva, E. (November 2008). Measuring skills for the 21st century. Washington, DC: Education Sector. Retrieved from http://www.educationsector.org/research/research_show.htm?doc_id=716323 (p. 2).
Print Participant’s Resource Packet, Unit 1, Session 3
Participants
Read National Association of Secondary School Principals. (2009). Putting assessment in the driver’s seat. Extracted with permission from http://www.carnegie.org/literacy/pdf/Culture_of_Literacy.pdf
and retrieved
3/10/10 from http://www.adlit.org/article/31350
Read Morsy, L., Kieffer, M., and Snow, C.E. (2010). Measure for measure: A critical consumers' guide to reading comprehension assessments for adolescents. New York, NY: Carnegie Corporation of New York. Excerpt retrieved 3/10/10 from http://www.adlit.org/article/34652. Read "What Should an Assessment System Look Like?"
Review Massachusetts Department of Elementary and Secondary Education.
(2010). Guidelines for developing an effective district literacy action plan, version
1.0. Malden, MA: Massachusetts Department of Elementary and Secondary
Education. Examine p. 94, “Massachusetts Secondary Literacy Framework” (In
Participant’s Resource Packet).
Note to Facilitators:
The last part of this session appears in a slightly different form in Unit 3; however, as Unit 3 is targeted specifically at those working with students at risk, this part was included in this session as well in order to encourage educators of ALL students to be thinking about assessment programs.
If this part of the session is presented, it should be focused on members of individual schools examining the existing assessment system and its associated roles and responsibilities (for basic, intermediate, and disciplinary literacy), sharing thinking about steps to improve the system, and clarifying the roles and responsibilities for ensuring that assessments are administered, interpreted, and used to enhance student outcomes.
DURING THE SESSION
Since last session, how have you worked at including students in the assessment process?
Get into groups of 4 and share with each other what the activity was, what went well, and how you might revise it in the future.
Take five minutes to write ideas about the next steps you would like to take to include your students in the assessment process.
These goals include the skills students need as well as an emphasis on what students can do with those skills. Every goal has multiple "sub-goals" and objectives or target skills. It is beyond the scope of this presentation to present all of these. The Landmark School Outreach Program has an outstanding book focused on expressive language that highlights objectives and offers teaching strategies: Roberta Stacey's Thinking About Language (see References).
It is important to emphasize the dual purpose of assessment—not only does assessment inform instructional decision-making at each level, but it also relates to students’ motivation to learn, especially when data is communicated in a fashion that answers students’ questions about their current level of skill/knowledge, guides them toward next steps for learning, and demonstrates their progress and success.
Asking good questions (about what type of data is needed for what purpose) guides the choice of assessment tools and how they are used. While institutions and programs need achievement data that can be reported both individually and in aggregate, teachers and students in classes need formative data—data that reveals how the individual student is progressing toward successful achievement of proficiency in set standards. This type of data requires
assessments that are directly aligned with curriculum, that provide specific data quickly, and that provide data that enables both teachers and students to gain a clear picture of areas of strength and need in order to inform the next steps in teaching and learning.
Notice in the chart on slide 7 that for the student and teacher categories, there are action questions/decisions that must be addressed. These can be addressed through formative assessment. For parents and administrators, the questions are more generally focused on outcomes, which can be addressed through summative assessment. Obviously, the latter have decisions to make as well, especially if the outcomes show lack of proficiency, but the point is that the types of assessment that are useful to teachers and students during instruction are different from those that are useful to parents and administrators.
Another level of stakeholder that Stiggins (2008) discusses is the policy level, including district, community, and state leaders who are focused on accountability.
These three slides, slides 8 through 10, adapted from the 2009 Massachusetts Department of Elementary and Secondary Education's Second Annual Curriculum, Instruction, and Assessment Summit, give participants an overview of the decisions, responsibilities, and data needed at the institutional, program, and classroom levels.
In this activity, participants look at the MA Secondary Literacy Framework, which sets the standards toward which teachers and students are working. These standards also need to be communicated clearly to teachers and students, and to be parsed into measurable learning targets in order for progress to be measured (and to guide instruction) along the way toward achievement.
Note: Because an essential element of good assessment is for all stakeholders to have a clear understanding of standards, participants may raise the question of whether MA standards are clearly communicated to all stakeholders. There are a variety of documents, from the frameworks, to the common core of learning, to the newly released Common Core State Standards, to the literacy framework participants will examine here. All articulate standards slightly differently and in different formats. In order to avoid confusion, participants will focus on the Massachusetts Secondary Literacy Framework.
1. Ask participants to review the Goal boxes on the chart from the DESE, "Massachusetts Secondary Literacy Framework."
2. Divide the group into small groups to facilitate discussion of the following:
   a. How are the learning targets for each goal articulated to students and teachers?
   b. How is progress toward achievement of each goal measured (e.g., what type of assessment is used)?
   c. In what ways does each assessment answer (or not) stakeholder questions?
3. Bring the whole group back together, asking each group to report in turn on their responses to a-c. This discussion can point schools toward goals for improving their assessment programs to ensure that a variety of assessments will provide the needed data to all stakeholders.
Using these slides, slides 12 through 14, summarize the critical foundations of assessment. While this Module is focused on assessment as a discrete activity, it should be emphasized continually that useful and balanced assessment should be seamlessly interwoven with curriculum and instruction.
In order to measure progress and achievement, we must articulate what counts as proficiency, and what are the learning targets along the way. This activity can enhance classroom teachers’ skill at developing valid assessments that reflect the progression of learning targets on the way to proficiency standards.
As there are many literacy goals/standards reflected in the MA Secondary
Literacy Framework, the MA ELA Frameworks, and the Common Core of
Learning, this activity can be extended beyond class by encouraging literacy teams/schools to create learning and skills progressions for each literacy goal.
This particular activity is meant to encourage classroom teachers to parse their own classroom-based overarching goals into learning targets, using this exercise as an example.
1. Ask participants to return to the groups they formed for the last activity.
2. Using the handouts provided in the Participant's Resource Packet (which include an example of breaking a literacy goal into its component learning targets), work in small groups to practice breaking overarching goals into learning targets that are articulated in measurable terms.
Slide 17 asks participants to think about where their own assessment activities fit in to the larger picture of assessment, and to gain awareness of the roles and responsibilities for assessment activities in their own schools/districts.
Break into two groups.
Group 1 should underline/note the places that require school policy related to assessment
Group 2 should underline/note the places that assume that personnel have the appropriate background/training to fulfill their responsibilities
Discuss what is needed in terms of policy and professional development in your school/district.
In order to begin building a useful and comprehensive assessment system, all stakeholders in assessment should possess a clear understanding of their roles and responsibilities. All schools have strengths and needs in the areas of professional collaboration toward goals (see (a) in the activity below), the identification/creation and timely administration of high quality assessments
(b), professional development/training that enables clear interpretation of assessment data (c), and communication of data in a meaningful way to all stakeholders (d).
1. Break the group into smaller groups (ideally by grade level).
2. Use the Roles and Responsibilities Sheets provided in the Participant's Resource Packet to identify some of the strengths and needs in the assessment system at your school/in your district regarding the following:
   a) Educators working together to meet individual student needs
   b) Useful assessments that provide needed data
   c) Systematic interpretation that informs instruction
   d) Communication of goals and progress to all stakeholders
AFTER THE SESSION (FOR NEXT TIME…)
Revisit your goals for your class(es).
Create a list of learning targets/objectives that will move students toward each goal.
Evaluate the assessments you use:
Are the learning targets clear to students so that they can understand where they are on the way to the goal?
Do the assessments measure progress on the learning targets?
REFERENCES
Afflerbach, P. (2008). Meaningful assessment of struggling adolescent readers. In S. Lenski & J. Lewis (Eds.), Reading success for struggling adolescent readers (pp. 249-264). New York: The Guilford Press.
Brenner, D., Pearson, P. D., & Rief, L. (2007). Thinking through assessment. In K. Beers, R. E. Probst, & L. Rief (Eds.), Adolescent literacy: Turning promise into practice (pp. 257-272). Portsmouth, NH: Heinemann.
LeGeros, L., O’Brien, G., & Raha, M. (November 3 and 4, 2009) Implementing a
balanced assessment system. Presentation at Second Annual
Curriculum, Instruction, and Assessment Summit. Marlborough, MA:
Massachusetts Department of Elementary and Secondary Education.
Levin, H. M.; Catlin, D.; Elson, A. (2009). Adolescent literacy programs: Costs of
implementation. New York: Carnegie Corporation.
Morsy, L., Kieffer, M., & Snow, C. E. (2010). Measure for measure: A critical consumers' guide to reading comprehension assessments for
adolescents. New York: Carnegie Corporation of New York.
Silva, E. (November 2008). Measuring skills for the 21st century. Washington, DC: Education Sector. Retrieved from http://www.educationsector.org/research/research_show.htm?doc_id=716323
Stacey, R. (2003). Thinking about language: Helping students say what they
mean and mean what they say. Prides Crossing, MA: Landmark School,
Inc.
Stiggins, R. (April 2008). Assessment manifesto: A call for the development of balanced assessment systems. Portland, OR: ETS Assessment Training
Institute, p. 3.
Torgeson, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research Corporation, Center on
Instruction.
ADDITIONAL RESOURCES
Berends, M., Kirby, S. N., Naftel, S., & McKelvey, C. (2001). Implementation and performance in New American Schools. Santa Monica, CA: RAND Education.
Denti, L., & Guerin, G. R. (2007). Effective practice for adolescents with reading and literacy challenges. New York, NY: Routledge.
Page 68 Adolescent Literacy Facilitator’s Guide
Unit 2 is appropriate for any secondary educator (grades 4th-12th) interested in developing or enhancing assessment practices to better guide instruction, and to more effectively include students in the learning and assessment process.
INTRODUCTION
Classroom assessment can take many forms. This unit invites participants to examine their assessment practices through the lens of formative assessment for learning, a process that aims to bridge achievement gaps by including students in the assessment process and by engaging in assessment activities that evaluate the effectiveness of teaching and learning while it is occurring. Unit 2 comprises three sessions.
Session 1 builds upon Unit 1 of this Module, further introducing formative assessment design, and inviting participants to consider the language skills of literacy—listening, speaking, reading, and writing. Participants are encouraged to differentiate assessment of knowledge from assessment of skill, and to identify ways to include students in the assessment process.
Session 2 delves further into formative assessment, distinguishing “pure” formative assessment for learning from the formative use of summative assessments. In this session, participants identify new approaches to assessment they can try in their classrooms.
Session 3 invites participants to view assessment through the students’ eyes, and to engage in discussion of student-designed rubrics, self-assessment, peer assessment, and portfolio assessment.
Unit 2: Overview
ELL Connections:
Like that of their English-only (EO) peers, English language learners’ (ELLs’) language proficiency is foundational for literacy acquisition. “Available research suggests that English oral language proficiency is consistently implicated when larger chunks of text are involved, whether in reading comprehension or writing” (Geva, 2006, p. 139). Therefore, attention to ELLs’ oral language development, particularly cognitive academic language proficiency (CALP) skills (Cummins, 1980), is a crucial assessment focus for monitoring language development and informing literacy instruction. For a brief but useful overview of the characteristics of a learner’s language development, see the DESE Massachusetts English Language Assessment – Oral (MELA-O) rubric at the end of the English Language Proficiency Benchmarks and Outcomes for English language learners (2003): http://www.doe.mass.edu/ell/benchmark.pdf. Please note that to use this assessment, educators must be trained and establish inter-rater reliability with the DESE. There should be a faculty member on your staff who has had this training and conducts this assessment of the school’s ELLs once per year. We recommend this document only as a resource for an overview of oral language assessment of ELLs.
GUIDING QUESTION/OBJECTIVE
What role does assessment play in decision-making related to improving adolescent literacy?
What can we do to enhance our assessment practices in the classroom?
In what ways do we need to add to our understanding of the purpose of assessment?
How can we design assessments that better measure disciplinary literacy?
What steps can we take to include our students in the assessment process?
To understand the foundational concepts related to formative assessment in the classroom
To generate and try out teaching ideas related to incorporating formative assessment in classroom teaching
INTRODUCTION
In order to enhance assessment practices that improve disciplinary literacy for adolescents, we need to be clear in our thinking about the responsibilities of schools, the purpose of assessment, and the use of assessment data. In order to participate in formative assessment in a meaningful way, the overall learning goals and pre-requisite sub-skills for meeting those goals, as well as definitions for what constitutes success and how that success will be recognized both during and after instruction, must be clear to both teachers and students.
Unit 2: Session 1
BEFORE THE SESSION
Read Stiggins, R. (2008). Assessment manifesto: A call for the development of balanced assessment systems. ETS Assessment Training
Institute, Portland, OR. Accessed March 15, 2010 from http://www.ascd.org/publications/educational_leadership/dec08/vol66/n um04/Answering_the_Questions_That_Count.aspx
Read Carnegie Mellon University. Grading vs. assessment of learning outcomes: What’s the difference? http://www.cmu.edu/teaching/assessment/howto/basics/gradingassessment.html
Read Carnegie Council on Advancing Adolescent Literacy. (2010). Time to act: An agenda for advancing adolescent literacy for college and career success . New York, NY: Carnegie Corporation of New York. “Essential
Elements of Literacy for Adolescent Learners” pp. 72-79.
Review Black, P., & Wiliam, D. (1998). Inside the Black Box. Phi Delta
Kappan, 80(2), 139. Retrieved from
http://blog.discoveryeducation.com/assessment/files/2009/02/blackbox_ article.pdf
Consider Reading Silva, E. (2008, November). Measuring skills for the 21st century. Washington, DC: Education Sector. Retrieved from http://www.educationsector.org/research/research_show.htm?doc_id=716323
Consider Reading Perie, M., Marion, S., Gong, B., & Wurtzel, J. (2007). The role of interim assessments in a comprehensive assessment system. The
Aspen Institute . www.achieve.org/files/TheRoleofInterimAssessments.pdf
Consider Reading Organisation for Economic Co-operation and Development. www.oecd.org/dataoecd/47/61/35070367.pdf
Consider Reading Wurtzel, J. (2009, June) “The role of interim assessments in a comprehensive assessment system.” In L. M. Pinkus
(Ed.), Meaningful measurement: The role of assessments in improving high
school education in the twenty-first century. Washington, DC: The Alliance for Excellent Education. Accessed March 15, 2010 from http://www.all4ed.org/files/MeaningfulMeasurement.pdf
pp. 77-94.
Print Copies of participant readings for Session 1 to distribute prior to the session.
Print Copies of participant readings for Session 2 to distribute at the end of the session.
Print Session 1 Participant’s Resource Packet to be distributed at the beginning of the session.
Procure Session 1 materials (see Materials below).
Ask Participants to bring with them several samples of their own classroom assessments (quizzes, homework assignments, tests, projects, etc.)
Participants
Read Stiggins, R. (2008). Assessment manifesto: A call for the development of balanced assessment systems. ETS Assessment Training Institute, Portland, OR.
Accessed March 15, 2010 from
Read Carnegie Mellon University. Grading vs. assessment of learning outcomes: What’s the difference?
Read Carnegie Council on Advancing Adolescent Literacy. (2010). Time to act: An agenda for advancing adolescent literacy for college and career success. New York, NY: Carnegie Corporation of New York. “Essential Elements of Literacy for Adolescent Learners,” pp. 72-79.
Consider Reading Black, P., & Wiliam, D. (1998). Inside the Black Box. Phi Delta Kappan, 80(2), 139. Retrieved from Ebsco Professional Development Database. Available as an online article at blog.discoveryeducation.com/assessment/files/.../blackbox_article.pdf
Sticky note pads (at least one for each participant)
2 poster boards, or wall space to display sticky notes
DURING THE SESSION
Have participants go through the Think/Talk/Write Process to recall their own experiences of assessment as students.
Think: Take a few moments to reflect on your testing experience as a middle and high school student. What good and bad experiences did you have?
Talk: Turn to your neighbor and share an example of a good experience with assessment and a bad experience with assessment.
Explain what made them so.
Write: If you could have changed three things about your experience of assessment when you were in school, what would they be?
Consider asking participants to share the changes they wrote about in the
“Write” section with the group as a whole. This activity often provides the foundation for a good conversation about students’ experiences of assessment.
These first several slides highlight important elements addressed in the reading
(Stiggins, 2008).
Use the Text-Rendering Experience
( http://www.nsrfharmony.org/protocol/learning_texts.html
). This protocol is included at the end of the Unit 2 Participant’s Resource Packet as well.
To facilitate discussion of what a balanced assessment system means for the classroom in light of the reading, “Assessment Manifesto: A Call for the
Development of Balanced Assessment Systems.”
1. Work with a partner and exchange the sample assessments you brought with you to the session.
2. What is it measuring?
   a. Content area knowledge
   b. Content area skill
   c. Both content knowledge AND skill
3. Discuss ways in which the assessments might be enhanced to provide clearer data about student strengths and needs in the subject.
It is always important to call attention to the fact that we mostly measure listening and reading comprehension through writing-based assignments (and less frequently through oral-based assignments). A student’s poor performance on these may not be an indication of poor listening and reading, but rather an indication of difficulty with expressing ideas coherently in writing or speaking.
These slides, 11-16, address language-based goals, and encourage discussion of how a teacher might design assessments that measure the example skills, and the difference between grading and assessment for learning.
Choosing and using assessments that provide useful information necessitates an understanding of WHAT we want to assess. Every goal has multiple “sub-goals” and objectives or target skills. Though it is beyond the scope of this presentation to present all of these, two books focused on expressive language highlight goals and objectives and offer teaching strategies: Roberta Stacey’s Thinking About Language, and Charles L. Haynes & Terrill Jennings’ From Talking to Writing. See the publications page at www.landmarkoutreach.org.
Literacy skills consist of four areas of language skill: listening, speaking, reading, and writing. These skills develop interactively in relation to executive function.
Listening comprehension (part of the set of receptive language skills) underlies reading comprehension—fluent readers “translate” the written word into something approximating spoken language, which is why prosody in oral reading is a good indicator of comprehension.
For each of the slides on language-based goals, a discussion of how a teacher might assess the example skills can provide good thinking about assessment design that matches purpose. For example, if a teacher wants to assess students’ listening skills, asking them to write a summary would not necessarily be a valid assessment of listening—it would be an assessment of both listening and summarizing. Some students listen quite actively, but lack the skill to structure a summary, so they would do poorly on the assessment even though they had gained the knowledge from listening. A valid assessment of listening skill might be to present students with a written list of the main ideas and supporting examples and have them match. Awareness of these issues is important for both teachers and students if they are to gain an accurate picture of the students’ current disciplinary knowledge and skill so that both teachers and students know where they stand in relation to proficiency standards.
Learning goals are related to overall standards set by the school/district/state.
Most classroom teachers at the secondary level will raise the question of grades. This model provides one cogent response to how to use classroom assessments such as quizzes, tests, etc. formatively. Teachers are asked to shift from grading different classroom activities with a single grade, which is uninformative to either teachers or students, toward categorizing assessment questions/activities by the target knowledge and skill, then assigning a score to each. Analyzing scores in this way shows immediately which knowledge and skills need further instruction if the student is to reach the overall course goals. Over time and multiple assessments, a clear picture of student learning outcomes emerges, one that delineates strengths and needs across knowledge and skill goals.
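For facilitators or data specialists who want to illustrate this shift concretely, the tallying of per-target scores can be sketched in a few lines of code. The fragment below is purely illustrative and not part of the original guide: the learning targets, point values, and 70% re-teaching threshold are all invented for the example.

```python
# Illustrative sketch: scoring an assessment by learning target instead of
# assigning one overall grade. Targets, points, and threshold are hypothetical.
from collections import defaultdict

# Each assessment item is tagged with the learning target it measures:
# (question_id, learning_target, points_earned, points_possible)
student_results = [
    ("q1", "main idea",             4, 5),
    ("q2", "main idea",             3, 5),
    ("q3", "supporting evidence",   1, 5),
    ("q4", "vocabulary in context", 5, 5),
]

def score_by_target(results):
    """Return percent mastery per learning target."""
    earned = defaultdict(int)
    possible = defaultdict(int)
    for _, target, pts, max_pts in results:
        earned[target] += pts
        possible[target] += max_pts
    return {t: round(100 * earned[t] / possible[t]) for t in possible}

mastery = score_by_target(student_results)
# A teacher might flag any target below a chosen threshold for re-teaching.
needs_reteaching = [t for t, pct in mastery.items() if pct < 70]
```

A single overall grade on this assessment (13/20, or 65%) hides the fact that the student is near mastery on two targets and struggling badly on one; the per-target breakdown makes the needed re-teaching immediately visible.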
If there are enough participants from each discipline, they can form groups according to the content area of the assessment.
1. Examine the sample assessments from different disciplines (those in the Participant’s Resource Packet as well as those participants brought with them to the session). It would be useful for those who brought sample assessments with them to trade off with others so that a “fresh eye” can look at the assessments. The sample assessments included in the Participant’s Resource Packet cover a span of grade levels, and are not all complete assessments (some are just a few questions from a longer assessment). The goal is to get participants thinking about how assessment questions relate to standards, as well as what knowledge and skill level is assumed in order for the student to perform successfully on these tasks.
2. Work with a partner to:
   a. Identify what proficiency standards the sample assessments/questions are assessing. For this part of the activity, remind the participants about the reading they did (“Grading vs. Assessment: What’s the Difference?”). Identifying the standards/learning targets is the first step toward designing assessments that validly measure progress toward learning, and learning outcomes.
   b. Identify some of the pre-requisite sub-skills required to perform successfully on the assessment.
3. Discuss: What are some ways this assessment might be used to guide future instruction
   a. For students who performed poorly?
   b. For students who performed proficiently?
The last slides in this session ask participants to begin thinking about students as part of the assessment process.
Participants can work individually, writing what they do on one color sticky note, and what they could do on another color sticky note. Then, notes can be put together onto large poster boards or an empty wall space. A summarizer can be assigned to each board to collect ideas into categories, then report to the group for list-making.
Think about what you do AND what you could do in your class to include students in the assessment process:
1. At the goal-setting level
2. During instruction (formative assessment)
3. After summative assessment
AFTER THE SESSION (FOR NEXT TIME…)
Try redesigning one of your current assessments following the ideas discussed in the Carnegie Mellon resource “Grading and Assessment:
What’s the Difference?” and come to the next session prepared to talk about what you did and how it went.
Take some time to write down your goals for one of your classes in terms of the knowledge and skills you expect students to achieve by the end of the year.
Take 10 minutes in one of your classes to ask your students to brainstorm what they think are the goals they are expected to achieve in your class.
Collect the brainstorms and read over them.
REFERENCES
Brenner, D., Pearson, P. D., & Rief, L. (2007). Thinking through assessment. In K.
Beers, R. E. Probst, & L. Rief (Eds.), Adolescent literacy: Turning promise
into practice (pp. 257-272). Portsmouth, NH: Heinemann.
Carnegie Mellon University. Grading vs. assessment of learning outcomes:
What’s the difference? Retrieved April 16, 2010 from http://www.cmu.edu/teaching/assessment/howto/basics/gradingassessment.html
Stiggins, R. (April 2008). Assessment manifesto: A call for the development of balanced assessment systems. Portland, OR: ETS Assessment Training
Institute.
Torgeson, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy
instruction. Portsmouth, NH: RMC Research Corporation. Center on
Instruction.
ADDITIONAL RESOURCES
The Association for Achievement and Improvement through Assessment. (2002, May). Secondary assessment practice: Self-evaluation and development materials. www.aaia.org.uk/pdf/Publications/finalbooklet.PDF
Schreyer Institute for Teaching Excellence http://cte.umdnj.edu/student_evaluation/evaluation_cat.cfm
Stacey, R. (2003). Thinking about language: Helping students say what they
mean and mean what they say. Prides Crossing, MA: Landmark School, Inc.
Unit 2: Session 2
GUIDING QUESTION/OBJECTIVE
What role does assessment play in decision making related to improving adolescent literacy?
What can we do to enhance our assessment practices in the classroom?
What is formative assessment and why should we use it?
How can I enhance the formative assessment I already use?
To understand formative assessment and formative assessment for learning
To identify current formative practices and generate ideas for incorporating further formative practices in participants’ classes
INTRODUCTION
This session invites participants to understand formative assessment for learning as distinct from the broader term “formative assessment,” to examine ways in which these practices are used in schools, and to identify activities to incorporate into their teaching.
BEFORE THE SESSION
Read Chappuis, J., Chappuis, S., & Stiggins, R. (2009). Formative assessment and assessment for learning. In L. M. Pinkus (Ed.), Meaningful measurement: The role of assessments in improving high school education
in the twenty-first century (pp. 55-76). Washington DC: Alliance for
Excellent Education. Available from http://www.all4ed.org/publication_material/reports/meaningfulmeasurement
Read Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2004).
Working Inside the Black Box: Assessment for Learning in the Classroom.
(cover story). Phi Delta Kappan, 86(1), 9-21. Retrieved from www.datause.cse.ucla.edu/DOCS/pb_wor_2004.pdf
Read McManus, S. (2008). Attributes of effective formative assessment.
Washington, DC: Council of Chief State School Officers. http://www.ccsso.org/publications/details.cfm?PublicationID=362
Read Wylie, C. E. (2008). Formative assessment: Examples of practice.
Washington, DC: Council of Chief State School Officers. (p. 3). http://www.ccsso.org/publications/details.cfm?PublicationID=363
Consider Reading Darling-Hammond, L. & Pecheone, R. (2009). Reframing accountability: using performance assessments to focus learning on higher-order skills. In Pinkus, L.M., ed. (2009, June) Meaningful measurement: The role of assessments in improving high school education
in the twenty-first century. Washington, DC: The Alliance for Excellent
Education. Accessed March 15, 2010 from http://www.all4ed.org/files/MeaningfulMeasurement.pdf
(pp. 25-53).
Consider Reading Fillmore, L. W. and Snow, C. E. (August 23, 2000). “What teachers need to know about language.” Center for Applied Linguistics.
Available in pdf at psu.edu by doing a search on the title.
Print Copies of participant readings for Session 3 to distribute at the end of the session.
Print Handout: Attributes of Formative Assessment (McManus, S. (2008). http://www.ccsso.org/publications/details.cfm?PublicationID=362 . This handout is not included in the Participant’s Resource Packet.
Print Session 2 Participant’s Resource Packet to be distributed at the beginning of Session 2
Participants
Read Chappuis, J., Chappuis, S., & Stiggins, R. (2009). “Formative assessment and assessment for learning.” In Pinkus, 2009, Meaningful measurement. http://www.all4ed.org/publication_material/reports/meaningfulmeasurement
Read Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2004). Working
Inside the Black Box: Assessment for Learning in the Classroom. (cover story).
Phi Delta Kappan, 86(1), 9-21. Retrieved from www.datause.cse.ucla.edu/DOCS/pb_wor_2004.pdf
Consider Reading McManus, S. (2008). Attributes of effective formative
assessment. Washington, DC: Council of Chief State School Officers. http://www.ccsso.org/publications/details.cfm?PublicationID=362
Consider Reading Wylie, C. E. (2008). Formative assessment: Examples of
practice. Washington, DC: Council of Chief State School Officers. http://www.ccsso.org/publications/details.cfm?PublicationID=363
DURING THE SESSION
Use a round-robin discussion to generate talk about classroom assessment activities since the last session.
Ask participants to spend 2-3 minutes describing one or more things they did in their classrooms since the last session, and any impact they noticed on their teaching experience, or their students’ learning experiences.
After one person has spoken, the next person should ask him or her a question about the description. The original speaker should answer.
The discussion then progresses around the table with each person having an opportunity to describe classroom activity and answer a question about it, and each person having an opportunity to ask the describer a question.
Review these terms. Note that the Chappuis, Chappuis, & Stiggins article distinguishes even further between formative assessment (as defined above) and formative assessment for learning. Any assessment that is used to guide actions that will improve learning can be considered formative. However, formative assessments for learning are more individualized to specific students, and the actions are more immediate. Examine the chart on p. 62 of the reading for further explanation.
Ask participants to spend 10 minutes working individually or in pairs to review the article with the goal of listing (or highlighting in the article) as many formative learning/teaching strategies as they can (at least 10). After 10 minutes, bring the group back together as a whole, and go around the table asking each person to say a strategy from their list, and asking others who do not have it on their list to add it. As the callouts proceed around the table, only strategies not mentioned already should be added.
These slides clarify the differences between formative assessment and formative assessment for learning.
The achievement gains reported in the studies of formative assessment were among the largest found for any educational intervention. The article is a review of international literature between 1988 and 1998. Studies were done on groups of students from elementary through college.
In the classroom, teachers use formal assessments (tests, quizzes, projects, assignments, performances, etc.) and informal assessments (questioning, dialogue, observation, anecdotal note-taking). In any of these instances, they may or may not be engaged in formative assessment.
The DESE lists classroom assessments (tests, etc.) as well as interim benchmark assessments in the “formative” category because sometimes these types of assessments can be used in formative ways. It is important to note, however, the point that Chappuis, Chappuis, and Stiggins make in the reading: “Summative assessments aren’t bad or wrong; they’re just not formative. They have a different purpose: to report out level of achievement. Mislabeling them as formative, or using summative assessment information in formative ways, will not generate the achievement gains realized in formative assessment research studies” (p. 58).
1. Ask participants to complete the Confidence Survey in the Participant’s Resource Packet.
2. Divide the group into 5 groups, each of which will have the responsibility to discuss one of the five attributes of formative assessment (10-15 minutes). Ask each group to discuss and list
   a. what aspects of their instruction currently reflect this attribute, and
   b. what they see as challenges to have their classroom assessments more fully reflect that attribute.
3. When the groups finish and return as a larger group, ask each group to report out.
4. List the current practices and perceived challenges on a board or flipchart.
The goal of this activity is not to solve the challenges. Rather, it is to become aware of the standards and attributes of formative assessment for learning, reflect upon where we are in our current practice, and identify the challenges we can undertake to further progress toward the goal.
Discuss self-questioning as an essential strategy to improve thinking. Review the questions posed in the reading “Formative Assessment and Assessment for
Learning” with the goal of focusing on the questions that drive instruction and learning. Leave time for participants to add their own suggestions of questions to the lists.
State assessments (MCAS) have very limited formative use. Their purpose is summative so there is little data about individual student strengths and needs.
In addition, the results are delivered too late to inform instruction.
Interim/Benchmark assessments are often intended to be used formatively but are frequently administered and scores reported with no change to instruction related to individual students’ performances on them.
Graded classroom assessments (tests, quizzes, projects, etc.) can be readily adapted to formative use because the results are readily available, and the learning targets have been recently taught. As was described in Session 1, if teachers track the learning target for each assessment question, they can use the results to select and re-teach what the students have not mastered. Along the same lines, teachers can help students use their graded classroom assessments to improve their learning: if each assessment question or task is explicitly tied to a learning goal, students can analyze their results to understand where they are on the way to proficiency at a given standard, how much progress they have made, where they performed poorly and why, and what they need to do to improve.
For graded classroom assessments to be used formatively, teachers need to make time for students to relearn the knowledge and skills they didn’t demonstrate on the assessment, and then re-take the assessment. (Note: This will likely lead to a discussion about grades, grade inflation, and fairness. The role grades play in educational culture is beyond the scope of this Module in spite of its importance and the fact that it does act as an impediment to full use of formative assessment. At this point, it is important to ask participants to merely “think along” with you about what could happen in classrooms if the focus shifted from achievement grades for ranking to assessment as feedback on what next steps to take.)
Informal, ungraded classroom assessments (e.g., discussion, Q&A, observation of group and independent work) can be formative if the teachers and students use the information to inform their next steps. For example, suppose a classroom discussion reveals misconceptions about the causes of the Civil War. If the teacher stops the discussion, offers feedback to the students that they were focusing on one cause of the war when in fact there were several, points them toward their notes or the text, and tells them to review the causes and come to class the next day with questions, the discussion is an example of formative assessment for learning. If the discussion ends without the teacher offering this feedback, and the next class takes up a new topic, then the discussion is not being used formatively.
Ask participants to examine the chart on p. 62 of “Formative Assessment and Assessment for Learning,” and make notes on it about:
1. how their students are currently assessed (they can include district and school-wide assessments as well as their own classroom assessments)
2. whether and how those assessment results are used to inform learning
When they have had about 10 minutes to examine the chart and make notes, ask them to turn and talk to a partner about what they learned from considering assessment in this way, and what they might be interested in doing to increase their use of formative assessment for learning in their classes. They should be encouraged to use the list of strategies they identified in the warm-up activity.
Apply some of the thinking that has happened in the session to examples of classroom assessments.
1. Divide into three groups.
2. Refer to pages of Chappuis, Chappuis, & Stiggins:
   Group 1: pp. 65-66, “Where am I going?”
   Group 2: p. 66, “Where am I now?”
   Group 3: p. 66, “How do I get there?”
3. Examine a sample assessment that is designed as an assessment of learning.
4. Generate ideas, based on formative assessment for learning, for how to help students answer the group’s given question (e.g., for question 1, the group may talk about what the teacher could have done prior to the assessment, as well as how the assessment might be redesigned to make it clearer to the student what the goals of the assessment are).
AFTER THE SESSION (FOR NEXT TIME…)
After the session, instruct participants to:
Take one strategy you learned about today, and try implementing it in one of your classes.
Come to the next session prepared to share your experience.
REFERENCES
Black, P., & Wiliam, D. (1998, March). Assessment and classroom learning. Assessment in Education, 5(1), 7-74.
Chappuis, J., Chappuis, S., & Stiggins, R. (2009). Formative assessment and assessment for learning. In Pinkus, 2009, Meaningful measurement.
Wylie, C. E. (2008). Formative assessment: Examples of practice. Washington,
DC: Council of Chief State School Officers.
ADDITIONAL RESOURCES
Andrade, H., & Cizek, G. J. (2009). Handbook of formative assessment. New York, NY: Routledge.
Stiggins, R., Arter, J. A., Chappuis, J., & Chappuis, S. (2009). Classroom assessment for student learning: Doing it right-using it well. Boston: Allyn & Bacon, Inc.
Black, P. (2003). Assessment for learning: Putting it into practice. Open University Press.

A 2-page brief of the book What teachers need to know about language, by Fillmore, L. W., & Snow, C. E. (2000): http://www.cal.org/resources/digest/digest_pdfs/0006fillmoresnowwhat.pdf
Unit 2: Session 3
GUIDING QUESTION/OBJECTIVE
What role does assessment play in decision making related to improving adolescent literacy?
What can we do to enhance our assessment practices in the classroom?
Why is student participation essential to an effective classroom assessment process?
What can we do to increase student participation in assessment for learning?
To learn formative assessment strategies targeted at including students in all aspects of the assessment process
INTRODUCTION
Session 3 focuses on providing participants with assessment strategies targeted at involving students in the assessment process as active constructors of success criteria, as well as evaluators of their own work and progress toward learning goals. The session asks participants to consider how they might institute or enhance their current use of these and related strategies. Example strategies include: involving students in defining success criteria; involving students in creating assessments; training students in self- and peer evaluation; and portfolio use.

Note to Facilitators:
There are many activities in this session which will make it run longer than the other sessions. You may choose to make this into two sessions in order to give participants ample opportunity to discuss the approaches described in the readings as well as related approaches for including students in the assessment process.
BEFORE THE SESSION
Read Stiggins, R. (May 2007). "Assessment through the student's eyes." Educational Leadership, 64(8), 22-26. Retrieved April 19, 2010 from www.ascd.org/.../Assessment_Through_the_Student's_Eyes.aspx
Read Smith, K. (November 2009). "From test takers to test makers." Educational Leadership, 67(3), 26-30. Retrieved April 19, 2010 from http://www.ascd.org/publications/educational_leadership/nov09/vol67/num03/From_Test_Takers_to_Test_Makers.aspx
Read Section 3 (A Four-Stage Model for Teaching Student Self-Evaluation) of Rolheiser, C., & Ross, J. A. Student self-evaluation: What research says and what practice shows. Retrieved April 19, 2010 from http://www.cdl.org/resource-library/articles/self_eval.php
Read p. 9 (chart called "Building Blocks to Pupil Self-Assessment") in AAIA Northeast Region. Self-assessment. Retrieved April 19, 2010 from www.aaia.org.uk/pdf/Publications/AAIAformat4.pdf
Read Paulson, F., Paulson, P., & Meyer, C. (1991). What makes a portfolio a portfolio? (cover story). Educational Leadership, 48(5), 60-63. Retrieved April 19, 2010 from www.stanford.edu/...portfolio/what%20makes%20a%20portfolio%20a%20portfolio.pdf
Consider Reading Skillings, M., & Ferrell, R. (2000). Student-generated rubrics: Bringing students into the assessment process. Reading Teacher, 53(6), 452. Retrieved from Professional Development Collection database. NOTE: This article is not freely available online but is a good piece; it is available through JSTOR and EBSCO, and is linked to an activity in the session.
Consider Reading Simpson, M. G., & Lovely, J. E. (April 2005). Peer and self-assessment strategies—A start! Retrieved April 19, 2010 from http://www.scribd.com/doc/4662920/Assessment-for-Learning-Peer-and-Self-Assessment-Strategies-Teacher-Notes
Consider Reading Barrett, H. (2007). Researching electronic portfolios and learner engagement: The REFLECT Initiative. Journal of Adolescent & Adult
Literacy, 50(6), 436-449. Retrieved from Professional Development
Collection database.
Print Session 3 of the Participant’s Resource Packet to be distributed at the beginning of Session 3.
Print Handout: pp. 75-104 of Torgesen, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research Corporation, Center on Instruction. #11 at http://center-on-instruction.org/resources.cfm?category=reading&grade_end=12&grade_start=4&subcategory=materials
(Note: This handout is for the extension activity. It is not in the Participant's Resource Packet.)
Participants
Read Stiggins, R. (May 2007). "Assessment through the student's eyes." Educational Leadership, 64(8), 22-26. Retrieved April 19, 2010 from www.ascd.org/.../Assessment_Through_the_Student's_Eyes.aspx
Read Smith, K. (November 2009). "From test takers to test makers." Educational Leadership, 67(3), 26-30. Retrieved April 19, 2010 from http://www.ascd.org/publications/educational_leadership/nov09/vol67/num03/From_Test_Takers_to_Test_Makers.aspx
Read Section 3 (A Four-Stage Model for Teaching Student Self-Evaluation) of Rolheiser, C., & Ross, J. A. Student self-evaluation: What research says and what practice shows. Retrieved April 19, 2010 from http://www.cdl.org/resource-library/articles/self_eval.php
Read Paulson, F., Paulson, P., & Meyer, C. (1991). What makes a portfolio a portfolio? (cover story). Educational Leadership, 48(5), 60-63. Retrieved April 19, 2010 from www.stanford.edu/...portfolio/what%20makes%20a%20portfolio%20a%20portfolio.pdf
Consider Reading Skillings, M., & Ferrell, R. (2000). Student-generated rubrics: Bringing students into the assessment process. Reading Teacher, 53(6), 452. Retrieved from Professional Development Collection database. NOTE: This article is not freely available online but is a good piece; it is available through JSTOR and EBSCO.
Consider Reading Simpson, M. G., & Lovely, J. E. (April 2005). Peer and self-assessment strategies—A start! Retrieved April 19, 2010 from http://www.scribd.com/doc/4662920/Assessment-for-Learning-Peer-and-Self-Assessment-Strategies-Teacher-Notes
DURING THE SESSION
Ask participants if they can recall the three guiding questions of assessment for learning (“Where am I now?” “Where am I going?” “How do I get there?”). Link to this current session by noting that one goal we have as educators is to foster students’ ability to ask and answer these questions themselves—that is, to become independent learners. The most effective way to do this is to teach students to become active participants in the assessment process.
This activity is intended to spur participants to think about assessment from their students' points of view, an important shift of perspective if classroom teachers are to implement assessment for learning effectively. The activity directly connects with the Stiggins article, "Assessment Through the Student's Eyes."
1. Ask participants to look over the examples of assessment results in the Participant's Resource Packet (there are several examples listed under Sample Assessment Results).
2. Use a modified Circle of Viewpoints Discussion Protocol, as outlined below, to take a student's point of view in relation to the assessment. http://pzweb.harvard.edu/vt/VisibleThinking_html_files/03_ThinkingRoutines/03e_FairnessRoutines/CircleViewpoints/CircleViewpoints_Routine.html
   a. Individuals in each group take turns providing their viewpoint, following this script:
      i. I am thinking of ... (name the assessment sample) from the point of view of ... (describe the student you are imagining).
      ii. I think/feel ... (describe your thoughts and feelings about the assessment; be an actor and take on the character of your viewpoint).
      iii. A question I have from this viewpoint is ... (ask a question from this viewpoint).
3. Ask participants to take the Assessment for Learning and the Marking and Providing Feedback surveys (in the Participant's Resource Packet).
In this part of the session, it is important to stress that fostering a positive student experience of assessment, and having students participate productively in assessment creation, evaluation, and interpretation, does not mean lowering standards. Instead, it asks educators to shift their focus from assessment of learning to assessment for learning.
In assessment for learning, the standards, and the learning targets along the way that lead to proficiency, are made transparent through understandable language, models, and constant classroom partnerships in learning between teachers and students, and among students themselves. The goal is to nurture students' independent learning and self-evaluation/self-monitoring skills. The objectives include teaching students to participate in defining criteria for success; to create tasks that assess progress toward mastering those criteria; to provide descriptive feedback to themselves and their peers on how the quality of their work matches the success criteria; and to reflect on their own progress toward proficiency on the standards.
Use the Text Rendering Experience (http://www.nsrfharmony.org/protocol/learning_texts.html) to encourage participants to glean the essential ideas from the Stiggins reading.
In this portion of the session, each grouping of slides focuses on an assigned reading and suggests a discussion strategy for thinking about how the ideas might be used in the classroom. Each of these strategies requires further reading, discussion, and practice to fully implement. The goal here is to increase awareness of these strategies and discuss them through the assessment lens, then to encourage participants to choose one to work toward implementing in their classrooms.
Reading, "From Test Takers to Test Makers"
1. Divide the group into three sub-groups.
2. Ask each group to review the steps outlined in one section of the article (sections listed on the slide).
3. Groups should discuss how this approach might work in their classes and what the challenges might be.
4. Have groups report back to the full group.
If possible, obtain this useful article to distribute to participants during the session:
Skillings, M. J., & Ferrell, R. (March 2000). Student-generated rubrics: Bringing students into the assessment process. The Reading Teacher, 53(6), 452-458. (It is in the "Consider Reading" category because it is not freely available on the web; it is available through JSTOR and EBSCO.)
Use the Four As Text Protocol to discuss “Student-Generated Rubrics”: http://www.nsrfharmony.org/protocol/learning_texts.html
The protocol is at the end of the Unit 3 Participant’s Resource Packet.
"Student Self-Evaluation: What Research Says and What Practice Shows"
"Building Blocks to Pupil Self-Assessment" (in the Participant's Resource Packet)
Peer and self-assessment strategies—A start!
Use the "Save the Last Word for ME" protocol to discuss the ideas in these readings (http://www.nsrfharmony.org/protocol/learning_texts.html). This protocol is in the back of the Participant's Resource Packet.
Reading: “What Makes a Portfolio a Portfolio?”
1. Divide participants into groups of three or four.
2. Ask them to nominate a scribe and a spokesperson.
3. Ask each participant in the group to take a turn pointing to one idea from the reading that sparked their thinking about the uses of portfolios in their classes.
4. Invite the groups to discuss how they might institute or enhance the use of portfolios for learning in their classes.
5. Reconvene the group and ask spokespeople to share the ideas from their group.
1. Discuss Sample Assessment Systems from Torgesen, pp. 75-104: Ten Examples.
2. Divide into groups to read, discuss, and summarize the examples.
3. Focus:
   a. What common elements do these different systems share?
   b. What can we learn from these that can guide us toward enhancing our own assessment systems?
Ask participants to complete the two surveys included in the Participant’s
Resource Packet (Assessment for Learning and Marking and Providing Feedback), then spend five minutes writing down the ideas and strategies they feel encouraged to try in their classes.
Note to Facilitators:
The surveys can be collected to inform plans for further professional development.
AFTER THE SESSION (FOR NEXT TIME…)
Encourage participants to try introducing and implementing one of these strategies, and to seek student feedback on the experience. Highlight that just as it takes reflection and time for teachers to shift their approaches to instruction, it also takes explicit instruction, guided practice, reflection and time for students to reevaluate their role in the classroom and invest in the active learning process.
REFERENCES
The Association for Achievement and Improvement through Assessment (May 2002). Secondary assessment practice: Self-evaluation and development materials. www.aaia.org.uk/pdf/Publications/finalbooklet.PDF
The Association for Achievement and Improvement through Assessment,
Northeast Region. Self-assessment. www.aaia.org.uk/pdf/Publications/AAIAformat4.pdf
Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2004). Working Inside the Black Box: Assessment for Learning in the Classroom. (cover story).
Phi Delta Kappan, 86(1), 9-21. Retrieved from Professional Development
Collection database.
Black, P., & Wiliam, D. (2001). Inside the black box: Raising standards through classroom assessment. Retrieved April 20, 2010 from http://www.weaeducation.typepad.co.uk/files/blackbox-1.pdf
Paulson, F. L., Paulson, P. P., & Meyer, C. A. (February 1991). "What makes a portfolio a portfolio?" Educational Leadership, 48(5), 60-63. Retrieved from www.faculty.mchschool.org/sperloff/edtech/portfolioarticle2.pdf
Rolheiser, C., & Ross, J. A. Student self-evaluation: What research says and what practice shows. http://www.cdl.org/resource-library/articles/self_eval.php
Simpson, M. G., & Lovely, J. E. (April 2005). Peer and self-assessment strategies: A start! www.scribd.com/.../Assessment-for-Learning-Peer-and-Self-Assessment-Strategies-Teacher-Notes
Skillings, M., & Ferrell, R. (2000). Student-generated rubrics: Bringing students into the assessment process. Reading Teacher, 53(6), 452. Retrieved from Professional Development Collection database.
Smith, K. (2009). From test takers to test makers. Educational Leadership, 67(3), 26-30. Retrieved April 19, 2010 from http://www.ascd.org/publications/educational_leadership/nov09/vol67/num03/From_Test_Takers_to_Test_Makers.aspx
Stiggins, R. (April 2008). Assessment manifesto: A call for the development of balanced assessment systems. Portland, OR: ETS Assessment Training
Institute.
Stiggins, R. (May 2007). “Assessment through the student’s eyes.” Educational
Leadership, v. 64, n. 8, pp. 22-26. Retrieved April 19, 2010 from www.ascd.org/.../Assessment_Through_the_Student's_Eyes.aspx
Stiggins, R. (1997). Student-centered classroom assessment. Columbus, OH:
Merrill. Cited in Student-generated rubrics: Bringing students into the assessment process. Reading Teacher, 53(6), 452. Retrieved from
Professional Development Collection database.
Torgesen, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research Corporation, Center on Instruction. #11 at http://center-on-instruction.org/resources.cfm?category=reading&grade_end=12&grade_start=4&subcategory=materials
ADDITIONAL RESOURCES
The Association for Achievement and Improvement through Assessment http://www.aaia.org.uk/assessment/assAAIAPub.asp
International Reading Association. Promoting Student Self-Assessment. http://www.readwritethink.org/professional-development/strategy-guides/promoting-student-self-assessment-30102.html#research-basis
http://rubistar.4teachers.org/
http://www.teach-nology.com/web_tools/rubrics/
http://center-on-instruction.org
Association of Assessment Inspectors and Advisors: Secondary
Assessment Practice
www.aaia.org.uk/pdf/Publications/finalbooklet.PDF
Unit 3: Overview
Unit 3 is appropriate for all interested educators, and it specifically addresses students at risk for poor academic performance. As such, the unit is especially important for literacy specialists, reading specialists, inclusion teachers, resource room teachers, and teacher aides, as well as other educators who provide academic support and formative assessment for students at risk. This unit assumes familiarity with the assessment topics covered in earlier units.
INTRODUCTION
Unit 3 addresses assessment for students at risk of learning difficulty and poor achievement. Assessment for this group of learners is of the highest importance. Lack of disciplinary knowledge and literacy skill deficits will not vanish on their own or with increased maturity, even with exposure to good teaching in an enriched curriculum. Students at risk need to be supported and their areas of need remediated if they are to progress toward proficiency standards that reflect college-level and 21st-century learning skills. Unit 3 comprises three sessions.
Session 1: For an effective assessment program to function, assessors must know what they are assessing and why. Session 1 of this unit reviews types of literacy and essential literacy skills (a brief review of Session 1 in Module 2); distinguishes between struggling learners and students with SLDs; reaffirms the need for a balance between formative and summative assessment practices; addresses the complex interplay between accommodation and remediation; and calls for an examination of the roles and responsibilities related to assessment (data collection, interpretation and recommendations, and data-based instructional decision making). Along with participants' existing knowledge, an understanding of these topics provides the foundation from which they may contribute to creating a comprehensive and effective assessment program at the school.
Session 2: Students struggling with one or more language-based literacy skills (listening, speaking, reading, writing) are generally identified by their poor performance on state tests and/or other standardized measures of achievement, as well as through all-school screening assessments for reading fluency and comprehension. Session 2 of this unit introduces literacy screening and diagnostic assessment. Participants have the opportunity to experience three types of screening assessment (curriculum-based ORF and Maze measures, and a test similar to a standardized reading assessment), to learn to identify students at risk through screenings, and to make initial recommendations for intervention groupings. Participants are then introduced to the specialized activity of diagnostic assessment. While administering and interpreting screening results is a relatively straightforward task, good diagnostic assessment requires a thorough knowledge of typical and atypical language development that is beyond the scope of these sessions. Several resources are provided to guide participants interested in gaining further knowledge in this area. The second part of Session 2 defines and provides examples of diagnostic assessment. Participants have the opportunity to analyze errors in a curriculum-based ORF passage and a writing sample, as well as to administer a Motivation to Read Profile.
Session 3: Identifying students at risk and assessing their particular learning needs is not enough, however. If schools are committed to closing achievement gaps by ensuring that all students make effective progress toward proficiency, student progress must be carefully monitored and changes in instruction instituted as needed, both at the individual instructional level within the tier and at the level of intervention intensity. Session 3 of this unit teaches participants how to monitor progress (set a baseline, calculate a slope, document data points, make instructional decisions). It is important to note that effective progress monitoring is completely dependent upon effective instruction. A teacher may easily learn and implement the steps of progress monitoring. However, if he or she lacks the professional background to provide structured, sequential, language-based interventions targeted to the individual student's needs, the student will not make effective progress. It is essential that teachers providing remedial instruction possess not only knowledge and skill in assessment, but also a thorough knowledge of the instructional programs they are using to remediate language skills. There are many outstanding programs for remediating deficits in receptive and expressive language skills, and they require fidelity of implementation if students are to progress efficiently and effectively.
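The arithmetic behind "set a baseline, calculate a slope, document data points" can be illustrated with a small sketch. The scores, the goal, the timeline, and the use of a least-squares slope below are all hypothetical illustrations, not values or methods prescribed by the module; they simply show how weekly data points yield a growth rate that can be compared against the growth rate needed to reach a goal (an "aimline"):

```python
# Illustrative progress-monitoring arithmetic (all numbers are hypothetical).
# Weekly oral reading fluency scores in words correct per minute (WCPM);
# week 1 serves as the baseline.
scores = [52, 54, 53, 57, 58, 61, 60, 63]

def slope(values):
    """Ordinary least-squares slope over weeks 1..n: average gain per week."""
    n = len(values)
    xs = range(1, n + 1)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

baseline = scores[0]
goal = 80           # hypothetical end-of-intervention target WCPM
weeks_to_goal = 20  # hypothetical length of the intervention period

# Aimline slope: the per-week gain needed to move from baseline to goal.
aimline_slope = (goal - baseline) / (weeks_to_goal - 1)
actual_slope = slope(scores)

print(f"baseline: {baseline} WCPM")
print(f"needed growth: {aimline_slope:.2f} WCPM/week")
print(f"actual growth: {actual_slope:.2f} WCPM/week")
# A common decision rule: if several consecutive data points fall below the
# aimline, change or intensify instruction.
```

The decision step then compares the two slopes: growth at or above the aimline suggests staying the course, while sustained growth below it signals a needed instructional change.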
Unit 3: Session 1
GUIDING QUESTION/OBJECTIVE
What role does assessment play in decision making related to improving adolescent literacy?
In what ways can educators use assessment to improve outcomes for struggling learners?
What is the relationship between basic, intermediate, and disciplinary literacy, and why is it important?
What role does assessment play in identifying and ensuring that struggling students and students with learning disabilities develop proficient literacy skills?
What are the roles and responsibilities for educators working toward enhancing a useful and comprehensive assessment program?
To understand how a balanced and systematic assessment system can close literacy achievement gaps
INTRODUCTION
In this session, participants will review basic and disciplinary literacies, identify language skills that underlie these literacies, and examine the roles and responsibilities that underlie an effective and balanced assessment system that guides instruction for struggling learners.
BEFORE THE SESSION
Read Carnegie Council on Advancing Adolescent Literacy (2010) A time to act: An agenda for advancing adolescent literacy for college and career
success. New York, NY: Carnegie Corporation of New York. “Essential
Elements of Literacy for Adolescent Learners” pp. 72-79. http://carnegie.org/publications/search-publications/pub/195/
Read National Joint Committee on Learning Disabilities. (June 2008). Executive summary of adolescent literacy and older students with learning disabilities. http://www.ldonline.org/pdfs/njcld/Executive_Summary-Adolescent_Literacy_Report.pdf
Read National Association of Secondary School Principals (2009). Putting assessment in the driver’s seat. Retrieved 3/18/10 from http://www.adlit.org/article/31350?theme=print
Consider Reading Denton, C., Bryan, D., Wexler, J., Reed, D., & Vaughn, S. (2007). Effective instruction for middle school students with reading difficulties: The reading teacher's sourcebook. University of Texas System/Texas Education Agency. Chapters 1-2. Retrieved 3/31/10 from http://www.meadowscenter.org/vgc/materials/middle_school_instruction.asp
Consider Reading Quenemoen, R. (June 2009). “Students with disabilities:
Expectations, academic achievement, and the critical role of inclusive standards-based assessments in improving outcomes.” In Pinkus, L. M., ed. (June 2009). Meaningful measurement: The role of assessments in
improving high school education in the twenty-first century. Washington,
DC: Alliance for Excellent Education. pp. 157-181.
Print Copies of the participant readings for Session 1 to distribute prior to the session.
Print Copies of the participant readings for Session 2 to distribute after the session.
Print The Session 1 Participant’s Resource Packet to be distributed at the beginning of the session.
Participants
Read Torgesen, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research Corporation, Center on Instruction, pp. 3-17. (Please bring the whole document with you to Session 1, as there will be an in-class activity centered on pp. 18-56.) www.centeroninstruction.org/files/Assessment%20Guide.pdf
Read Carnegie Council on Advancing Adolescent Literacy (2010). A time to act: An agenda for advancing adolescent literacy for college and career success. New York, NY: Carnegie Corporation of New York. "Essential Elements of Literacy for Adolescent Learners," pp. 72-79. http://carnegie.org/publications/search-publications/pub/195/
Read National Association of Secondary School Principals (2009). Putting assessment in the driver’s seat. Retrieved 3/18/10 from http://www.adlit.org/article/31350?theme=print
DURING THE SESSION
Based on the reading from A Time to Act, "Essential Elements of Literacy for Adolescent Learners," pp. 72-79.
As a group, review the elements of literacy by using the handout,
Elements of Literacy. Ask participants to define, orally, each term, and then fill out the sheet with several examples of how that element contributes to literacy skill. For example, under
“Phonemic Awareness,” participants might write the following:
Basic Literacy: students need this in order to read words and to spell words, as well as to understand spoken words, and speak words correctly.
Disciplinary Literacy: students need this to read classroom texts, to understand spoken vocabulary, and to spell in their writing.
The slides in this section of the session provide an introductory framework for thinking about levels of literacy skills and the role that remediation and differentiation/accommodations/modifications play in ensuring that all learners gain proficiency. The "Matthew Effect" is an important concept to discuss, especially in relation to adolescent struggling learners who have not received appropriate instruction to remediate their basic literacy needs. Some of these students will have significant gaps not only in basic skills, but also in the background knowledge that proficient readers gain from their exponentially wider exposure to reading over the years.
Some struggling learners have undiagnosed learning disabilities/differences, while others may struggle due to other disabilities, emotional issues, or environmental, cultural, or economic disadvantages. Some learners may struggle as a result of language barriers as non-native speakers of English.
The term 'specific learning disability' (SLD) means a disorder in one or more of the basic psychological processes involved in understanding or in using language, spoken or written, which disorder may manifest itself in the imperfect ability to listen, think, speak, read, write, spell, or do mathematical calculations.
Such term includes such conditions as perceptual disabilities, brain injury, minimal brain dysfunction, dyslexia, and developmental aphasia. Such term
does not include a learning problem that is primarily the result of visual, hearing, or motor disabilities, of mental retardation, of emotional disturbance, or of environmental, cultural, or economic disadvantage.
United States Code (20 U.S.C. §1401 [30])
The term “language-based learning disability” refers to a subset of SLD which is characterized by average to superior cognitive ability but poor academic performance in one or more areas due to neurobiological differences that impede the understanding and/or use of language in the areas of listening, speaking, reading, and/or writing.
Both IDEA 2004 and its federal regulations maintain the same definition of SLD as previous versions of the law and regulations.
While the definition of SLD remains unchanged in IDEA 2004, changes to the ways that schools can determine whether a student has an SLD are sure to have significant impact on school identification practices and procedures.
It is very important to underscore the point that literacy is more than reading skill, even though most of the conversation about adolescent literacy focuses on reading. Much focus is placed on reading because it is one of the most common difficulties for struggling students with and without learning disabilities. There is also a much larger research base concerned with identifying and remediating reading problems than there is for other areas of language development. Finally, in spite of the digital age in which we live, reading text continues to be the primary avenue for learning new information.
Choosing and using assessments that provide useful information necessitates an understanding of WHAT we want to assess.
Literacy Skills consist of four areas of language skill: listening, speaking, reading, and writing. These skills develop interactively in relation to executive function.
Listening comprehension (part of the set of receptive language skills) underlies reading comprehension—fluent readers “translate” the written word into something approximating spoken language, which is why prosody in oral reading is a good indicator of comprehension.
Participants should be aware that persistent problems with reading comprehension, in spite of targeted interventions to teach reading strategies, may indicate a deeper issue with receptive language skill and call for diagnostic assessment in this area. Similarly, students who, in spite of targeted interventions to develop fluency, fail to make effective progress and show difficulties in other literacy areas such as writing may have a deeper need rooted in the expressive language process (that is, these students comprehend well but have problems expressing their understanding fluently). In spite of Torgesen's (2009) arguments against formal diagnostic assessment, there is an important place for it in helping educators understand their students' needs and tailor their instruction accordingly.
Listening skills (auditory discrimination and processing, vocabulary, semantics, syntax, non-verbal communication, etc.)
Reading skills (phonological awareness, visual discrimination, comprehension, etc.)
Speaking skills (articulation, semantics, syntax, word retrieval)
Reading fluency skills (phonological processing, rate, accuracy, prosody)
Writing skills (fine-motor skills, spelling, word retrieval, semantics, syntax)
Emphasize here that the goal is to get all students to proficiency levels in disciplinary literacy skills. In order for students to achieve proficiency at this advanced level of reading SKILL (as opposed to knowledge of the discipline gained through other avenues) they must have solidly developed decoding and reading comprehension skills.
In general, to remediate means to start at the student's current level and provide targeted instruction and progress monitoring to bring that level to proficiency.
To accommodate means to provide students, without changing the expectations for standards mastery, with what they need to demonstrate their proficiency (e.g., audiobooks, graphic organizers, writing or note-taking templates, calculators, extended time, etc.).
To modify means to make some changes to curriculum and expectations so that students can access grade-level disciplinary knowledge and core expectations for learning (e.g., providing notes of lectures, summaries of discussions, texts that cover the essential ideas but are written at the student's reading level, shorter assignments, etc.).
There are many points of view about what constitutes a modification versus an accommodation. The point is that expectations for struggling students must remain high without shutting students down in hopelessness. It is essential for participants to know that the way to do this effectively is for the overall learning goals (both skills and knowledge), and the learning targets along the way, to be clearly identified by the teacher and made comprehensible to the student.
Often at the middle and high school levels, students do not receive remediation for their reading difficulties, but rather support, often in the form of accommodations and modifications, to ensure that they can participate in the curriculum with their non-struggling peers. Differentiated instruction and universal design for learning both address the inclusion of all learners in the curriculum. This support is extremely important and ensures that these students continue to be challenged to think at high levels even if they cannot read the texts fluently. HOWEVER, this support should operate alongside a remedial program whose goal is to increase reading skills to at least grade level, so that accommodations and modifications are no longer necessary and the student can access the curriculum independently.
It is also important to remember that: "Test results (of reading) can mislead teachers into believing that failures in content skills at lower levels will preclude students from learning the subject at a higher level. This assumption can lead to erroneous tracking decisions. For example, problems with computational skills do not necessarily lead to problems with the abstractions needed to master algebra" (Guerin & Denti, 2008, p. 152).
If the appropriate interventions are not provided for struggling students, you get what is commonly called “The Matthew Effect” (a name that derives from the Christian gospel of Matthew—For to all those who have, more will be given, and they will have an abundance; but from those who have nothing, even what they have will be taken away.— Matthew 25:29, New Revised Standard Version.
The term is used in education to refer to the ever-widening achievement gap between students who read proficiently and students who do not. Though the gap may be small at young ages, if appropriate and effective instruction is not provided, the gap widens profoundly. By the time students reach adolescence, the “gap” refers not only to reading skills, but also to gaps in general knowledge that result from a lack of reading.
Use the Text-Rendering Experience (http://www.nsrfharmony.org/protocol/learning_texts.html) as a structure for reviewing the reading of pp. 3-17 in Assessments to Guide Adolescent Literacy Instruction.
These two slides provide an overview of the assessments that should be in place in a comprehensive literacy program. They do not address the related essentials of school culture and expectations, or of data interpretation, management, and communication.
Professional development
Targeted instruction for teachers so they can knowledgeably select and administer assessments, interpret assessment data, and implement instruction with fidelity
Screening/Entry assessment
Establishes student skill levels
Formal and informal diagnostic assessment
Identifies specific areas of difficulty that are impeding progress (and sometimes identifies the causes of those difficulties)
Ongoing formative assessment for learning
Aligns student skill levels, content objectives, and standards, and leads to appropriate and timely instruction
Differentiated instruction/RTI
Targeted responses to group and individual assessment data
Progress Monitoring
Effects of new instruction and need for alternative/more intensive instruction

ELL Connections: It is important to assess students’ proficiencies in their first language (L1). Since L1 reading comprehension and writing skills are facilitative and predictive of L2 literacy acquisition (Genesee et al., 2006), educators need to know their students’ L1 skills in order to more accurately assess their ELLs’ literacy skills and provide effective instruction.
For content and standards for ELLs in MA, please see the English Language Proficiency Benchmarks and Outcomes for English Language Learners: http://www.doe.mass.edu/ell/benchmark.pdf
Module 3: Assessment
Unit 3: Session 1
This is an in-class reading and discussion activity of pp. 18-56 of Assessments to Guide Adolescent Literacy Instruction. The reading is an important element for participants, particularly those who have not participated in Units 1 or 2 of the Assessment Module. It will take about an hour.
Use a jigsaw model for the discussion part of this activity.
Divide the participants into five smaller groups of roughly equal size. (Group 1 has the most pages, so ask fast, efficient readers to volunteer for that group.)
Ask each group to read the selected pages, and extract and note (or highlight in the text) the main ideas from their section, clarifying any questions. The goal is for the group to agree on the key points/ideas of the section. Let them know that in the second part of this activity, each group member will be responsible for reporting out to a different group of colleagues.
When the groups are finished (about 20 minutes), have them reform into different groupings in which there is one representative for each section of the reading. Each group member in the new group should summarize his or her section, and answer questions from the other group members (about 30 minutes).
Group 1: pp. 18 to the section break on p. 31
Group 2: pp. 31 to the section break on p. 38
Group 3: pp. 38 to the section break on p. 43
Group 4: pp. 43 to the section break on p. 48
Group 5: pp. 48-56
This section asks participants to look at the existing assessment system in the school/district and identify strengths and needs. Literacy teams may have done a similar activity using the DESE guidance document Guidelines for Developing an Effective District Literacy Action Plan. This particular activity asks participants to consider different elements of the assessment process for both basic and disciplinary literacy, to examine who is responsible for those elements, and, after linking roles and responsibilities, to make notes about the school’s/district’s strengths and needs in terms of:
Educators working together
Screening and formative assessments that provide needed data
Systematic interpretation that informs instruction
Progress monitoring
Groupings for this activity can be done in a variety of ways. The simplest may be to create one group of participants who deal primarily with the remediation of basic literacy skills, and another group of participants who focus on supporting struggling learners in the disciplinary classroom.
AFTER THE SESSION (FOR NEXT TIME…)
Review from Session 1: Torgeson, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research Corporation, Center on Instruction. “The Role of Screening and Diagnostic Assessments in a Comprehensive Assessment Plan,” pp. 48-56.
Read Heller, R. “Make it a priority to assess students’ literacy skills.”
Read An introduction to curriculum-based measurement/curriculum-based assessment.
Read (with permission from the author).
Review Shefelbine, J. Framework for reading. Retrieved 3/15/10 from http://www2.ed.gov/teachers/how/tools/initiative/summeworkshop/valdes/edlite-slide003.html
Find out what assessments are used to measure students’ reading fluency and comprehension.
Find out what assessments are in place to measure spoken and written language skills.
REFERENCES
Carnegie Council on Advancing Adolescent Literacy. (2010). A time to act: An agenda for advancing adolescent literacy for college and career success. New York, NY: Carnegie Corporation of New York.

Denti, L., & Guerin, G. (Eds.). (2008). Effective practice for adolescents with reading and literacy challenges. New York, NY: Routledge.

Denton, C., Bryan, D., Wexler, J., Reed, D., & Vaughn, S. (2007). Effective instruction for middle school students with reading difficulties: The reading teacher’s sourcebook. University of Texas System/Texas Education Agency. Retrieved 3/31/10 from http://www.meadowscenter.org/vgc/materials/middle_school_instruction.asp

Deshler, D. D., Palincsar, A. S., et al. (2007). Informed choices for struggling adolescent readers: A research-based guide to instructional programs and practices. International Reading Association.

National Joint Committee on Learning Disabilities. (June 2008). Executive summary of adolescent literacy and older students with learning disabilities. Accessed 3/22/10 from http://www.ldonline.org

Quenemoen, R. (June 2009). “Students with disabilities: Expectations, academic achievement, and the critical role of inclusive standards-based assessments in improving outcomes.” In Pinkus, L. M. (Ed.), Meaningful measurement: The role of assessments in improving high school education in the twenty-first century (pp. 157-181). Washington, DC: Alliance for Excellent Education.

Shanahan, T., & Shanahan, C. (2008). Teaching disciplinary literacy to adolescents: Rethinking content-area literacy. Harvard Educational Review, 78(1), 40-59.

Specific Learning Disability. United States Code (20 U.S.C. §1401 [30]).

Wren, S. (August 7, 2003). Matthew effects in reading. http://www.balancedreading.com/matthew.html
Adolescent Literacy Facilitator’s Guide
ADDITIONAL RESOURCES
Denti, L., and Guerin, G., eds. (2008). Effective practice for adolescents with reading and literacy challenges. New York, NY: Routledge.
Pinkus, L. M., ed. (June 2009). Meaningful measurement: The role of assessments in improving high school education in the twenty-first century. Washington, DC: Alliance for Excellent Education.
ELL REFERENCES
Genesee, F., Geva, E., Dressler, C., & Kamil, M. (2006). Synthesis: Cross-linguistic relationships. In D. August & T. Shanahan (Eds.), Developing literacy in second-language learners: Report of the National Literacy Panel on Language-Minority Children and Youth (pp. 153–174). Mahwah, NJ: Lawrence Erlbaum Associates.
Module 3: Assessment
Unit 3: Session 2
GUIDING QUESTIONS/OBJECTIVES
What role does assessment play in decision making related to improving adolescent literacy?
In what ways can educators use assessment to improve outcomes for struggling learners?
What is screening and why do we do it?
What is diagnostic assessment and why do we do it?
What kinds of screening and diagnostic assessments can we use?
To learn about the roles that screening and diagnostic assessment play in guiding instruction for struggling students and students with learning disabilities
To practice administering curriculum-based measurements for reading
INTRODUCTION
This session introduces literacy screening as a recommended practice for all students. Initial literacy screening should be followed up by targeted screening and other assessments to gain diagnostic information about students whose scores put them at risk for poor academic performance. The session introduces curriculum-based measurement (CBM) and provides the opportunity for participants to administer screenings to their colleagues. It also introduces examples of commercially available screening and diagnostic assessments related to literacy skills as well as other assessments targeted at gaining diagnostic information related to the literacy-related affective and experiential domains of students.
ELL Connections:
It is important to understand that second-language students’ trajectories in literacy acquisition are similar to, but not the same as, those of monolingual students. If adolescent ELLs are from non-literate backgrounds, it is important to know that they navigate the early stages of reading (decoding and word recognition, for instance) with success similar to that of English-only (EO) students (Lesaux et al., 2006). It takes considerably longer, however, for ELLs to acquire successful reading comprehension skills. In screening and diagnostic assessments, it is important to assess the learner’s native-language and second-language proficiencies using tests normed with similar students, and to recognize that reading and writing difficulties may be due to lack of proficiency rather than a learning disability. For a helpful resource on what to consider when placing ELLs in an appropriate program of instruction, visit: http://www.colorincolorado.org/article/14317
BEFORE THE SESSION
Review from Session 1: Torgeson, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research Corporation, Center on Instruction. “The Role of Screening and Diagnostic Assessments in a Comprehensive Assessment Plan,” pp. 48-56.
Read Heller, R. “Make it a priority to assess students’ literacy skills.” http://www.adlit.org/adlit_101/improving_literacy_instruction_in_your_school/make_it_a_priority_to_assess_students_literacy_skills
Read An introduction to curriculum-based measurement/curriculum-based assessment. http://www.specialconnections.ku.edu/cgibin/cgiwrap/specconn/main.php?cat=assessment&section=cbm/main
Read Deno, S. L. (2003). Curriculum-based measures: Development and perspectives. http://progressmonitoring.org/RIPMBackgroundInfo2.html
Review Shefelbine, J. Framework for reading. Retrieved 3/15/10 from http://www2.ed.gov/teachers/how/tools/initiative/summeworkshop/valdes/edlite-slide003.html
Review Pitcher, S., Albright, L., DeLaney, C., Walker, N., Seunarinesingh, K., Mogge, S., et al. (2007). Assessing adolescents’ motivation to read. Journal of Adolescent & Adult Literacy, 50(5), 378-396. http://www.education.txstate.edu/ci/people/faculty/Delaney/contentParagraph/04/document/Delaney+3.pdf
Review Morsy, L., Kieffer, M., & Snow, C. (2010). Measure for measure: A critical consumers’ guide to reading comprehension assessments for adolescents. New York, NY: Carnegie Corporation of New York.
Review International Reading Association (2008). A critical analysis of eight informal reading inventories. Retrieved 3/15/10 from http://www.adlit.org/article/23373?theme=print
Review Section 1, Schoolwide Screening, in Johnson, E., Mellard, D. F., Fuchs, D., & McKnight, M. A. (2006). Responsiveness to Intervention (RTI): How to do it. Lawrence, KS: National Research Center on Learning Disabilities. http://nrcld.org/rti_manual/index.html
Read Torgeson, J., Houston, D., & Rissman, L. (2007). Improving literacy instruction in middle and high schools: A guide for principals. Portsmouth, NH: RMC Research Corporation, Center on Instruction. Retrieved 4/1/10 from http://www.centeroninstruction.org/resources.cfm?category=reading&grade_end=12&grade_start=6&subcategory=materials
Print Copies of participant readings for Session 3 to distribute at the end of the session.
Print Session 2 Participant’s Resource Packet to be distributed at the beginning of the session.
Print Article for use in class: Assessing Adolescents’ Motivation to Read http://www.education.txstate.edu/ci/people/faculty/Delaney/contentParagraph/04/document/Delaney+3.pdf
Print Handouts that are not included in the Participant’s Resource Packet:
Assessment Vocabulary Probe (attached to Facilitator’s Guide)
CBM Handout for Oral Reading Fluency (attached to Facilitator’s Guide)
CBM Handout for Maze Passages (attached to Facilitator’s Guide)
“The Slash Test” Handout (attached to Facilitator’s Guide)
Participants
Review from Session 1: Torgeson, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research Corporation, Center on Instruction. “The Role of Screening and Diagnostic Assessments in a Comprehensive Assessment Plan,” pp. 48-56.
Read Heller, R. “Make it a priority to assess students’ literacy skills.” http://www.adlit.org/adlit_101/improving_literacy_instruction_in_your_school/make_it_a_priority_to_assess_students_literacy_skills
Read An introduction to curriculum-based measurement/curriculum-based assessment. http://www.specialconnections.ku.edu/cgibin/cgiwrap/specconn/main.php?cat=assessment&section=cbm/main
Read Deno, S. L. (2003). Curriculum-based measures: Development and perspectives. http://progressmonitoring.org/RIPMBackgroundInfo2.html
DURING THE SESSION
In Session 1, participants were asked to identify what assessments are used to measure reading fluency, comprehension, spoken language skills, and written language skills. As participants share examples, consider categorizing them on a whiteboard, smartboard, or poster board for later additions and to reference at the end of the session when participants will be asked to further investigate assessments.
The Assessment Vocabulary Probe is a sample of a curriculum-based vocabulary measure. These terms are selected from the Assessment Module. The Vocabulary Probe can be used in several ways: here it provides the facilitator with a sense of which terms may need review, and it provides participants with a focus on key assessment vocabulary. The list (two 10-item matching probes) is not exhaustive. If a measure such as this were used to monitor student progress within a content classroom, the teacher would prepare a master list of all terminology for the year and select items from it (10-20) to create matching probes administered on a regular basis. This is different from a traditional vocabulary test in that it does not “count” toward an achievement grade; instead it is used to guide teachers and students toward next instructional steps.
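The probe-construction idea above can be sketched in a few lines. Everything here is illustrative: the function name, the sampling approach, and the sample master list are assumptions for demonstration, not part of any published tool.

```python
import random

# Illustrative sketch: build a short matching probe by sampling items from a
# teacher-prepared master list of the year's terminology. All names and the
# sample master list below are made up for illustration.
def make_probe(master_terms, n=10, seed=None):
    rng = random.Random(seed)
    chosen = rng.sample(sorted(master_terms), n)
    return {term: master_terms[term] for term in chosen}

master = {
    "validity": "scores measure what they are intended to measure",
    "reliability": "scores are stable across time and different examiners",
    "norm-referenced": "scores determined by position on a bell curve",
    "criterion-referenced": "scores indicate performance against a set standard",
    "quantitative data": "scores are numerical in nature",
    "qualitative data": "scores are descriptive in nature",
    "holistic scoring": "one overall score rather than scores on individual dimensions",
    "progress-based": "scores measure performance during an instructional period",
    "achievement-based": "scores measure performance at the end of an instructional period",
    "screening": "establishes student skill levels",
    "diagnostic assessment": "identifies specific areas of difficulty",
    "progress monitoring": "tracks effects of new instruction over time",
}

probe = make_probe(master, n=10, seed=1)  # a fresh 10-item probe each administration
```

Because items rotate from the same master list, repeated probes can track growth across the year without any single probe counting toward a grade.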
Slides 4-6 overview curriculum-based measures as well as examples of commercially available screening assessments.
Screenings for listening, speaking, and writing can also be administered, though much of the instruction on informal screening is focused on reading.
Students who are reasonably fluent on measures of oral reading, but show difficulty on achievement tests (MCAS) and on other assessments of reading comprehension (e.g., maze passages), may have comprehension challenges that need to be addressed with an enriched program targeted at developing vocabulary and reading comprehension strategies.
Students who score below average on measures of oral reading and below average on achievement tests (MCAS) and on other assessments of reading comprehension are likely to need significant intervention in basic literacy skills.
The further targeted assessment of this group is essential in order to ensure that instruction is appropriate to their needs. For example, some of these students will have excellent comprehension ability that is easily masked by their decoding weaknesses. These students will need targeted instruction to develop their decoding fluency. Other students in this group may struggle with both decoding and comprehension and require a significantly different intervention that addresses not only decoding, but also general receptive language skills and targeted reading comprehension skills.
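The two screening profiles above reduce to a simple decision sketch. The function and labels below paraphrase this guide and are assumptions for illustration only, not an official placement rule; as the text stresses, real placement requires further diagnostic assessment.

```python
# Sketch of the screening logic described above. The boolean inputs stand in
# for "at or above benchmark" judgments from oral reading fluency screening
# and from comprehension measures (MCAS, maze passages). The group labels
# paraphrase this guide and are illustrative, not an official rule.
def screening_group(fluent_oral_reader, adequate_comprehension):
    if fluent_oral_reader and adequate_comprehension:
        return "on track"
    if fluent_oral_reader:
        # Reasonably fluent but weak comprehension: enriched program
        # targeting vocabulary and reading comprehension strategies.
        return "comprehension enrichment"
    # Below average on both: significant intervention in basic literacy
    # skills, with further diagnosis to separate decoding-only difficulties
    # from combined decoding and comprehension difficulties.
    return "intensive basic skills intervention"
```

The point of the sketch is that screening only sorts students into broad groups; it cannot distinguish, within the last branch, a strong comprehender masked by decoding weakness from a student who struggles with both.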
1. Go over the instructions for CBM as a group and answer questions.
2. Ask participants to pair up. Make sure that at least one of the pair has a watch with a second hand, or that there is a clock with a second hand in the room.
3. Have the pairs assess each other. There are four reading passages in the handout; each has a reader form and a teacher form. Participants should take turns being the reader and then the tester. When it is the tester’s turn to become the reader, he or she should not read the passages just scored for the other person. The tester should administer the assessment, telling the reader that he or she has one minute to read the passage quickly and accurately. Any missed words, substituted words, added words, and mispronunciations should be noted on the tester’s sheet. The score is obtained by subtracting the number of errors from the total number of words read, giving the number of words read correctly per minute.
Module 3: Assessment
Unit 3: Session 2
Page 117
Unit 3: Session 2
Page 118
4. Each tester should administer two (2) ORF passages to the reader, marking errors as the reader reads.
Scoring: NOTE: While a tester would usually score the screening, it is recommended in this setting that testers return the marked passages to the readers so that the readers score themselves rather than be scored by one of their colleagues. This will prevent any discomfort on the part of participants who may not be fluent readers themselves.
Have participants follow the instructions for scoring their own oral reading fluency, then look at the charts provided to see where they fall on the 12th-grade fluency levels.
5. Ask participants to pair up with a different partner to discuss this experience.
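The ORF arithmetic above is simple enough to sketch. The benchmark value below is a made-up illustration, not an actual 12th-grade norm; use the charts in the handout for real levels.

```python
# Words correct per minute (WCPM): total words attempted in one minute
# minus the errors the tester marked.
def words_correct_per_minute(words_attempted, errors):
    return words_attempted - errors

score = words_correct_per_minute(162, 7)  # 155 WCPM
ILLUSTRATIVE_BENCHMARK = 150              # hypothetical cutoff, not a real norm
meets_benchmark = score >= ILLUSTRATIVE_BENCHMARK
```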
In this sample screening, because of time constraints, participants will administer only one (1) maze passage to each other even though in an actual screening, students would take two (2) and their scores would be averaged together.
1. Go over the instructions for CBM Maze Passages as a group and answer questions.
2. Ask participants to pair up. Make sure that at least one of the pair has a watch with a second hand, or that there is a clock with a second hand in the room.
3. Have the pairs assess each other.
4. Scoring: NOTE: It is recommended that the testers provide the tester passage to the readers so that the readers score themselves with the answer key rather than be scored by one of their colleagues. This will prevent any discomfort on the part of participants who may not be fluent readers themselves.
Have participants follow the instructions for scoring their own maze passages, then look at the charts provided to see where they fall on the 12th-grade maze chart.
5. Ask participants to pair up with a different partner to discuss this experience.
Torgeson recommends that initial screenings guide initial intervention placement, with Group 1 receiving less intensive interventions focused primarily on vocabulary and comprehension skills, and Group 2 receiving very intensive interventions focused on decoding and comprehension.
Note: There are a variety of screening measures available. While CBMs are both valid and cost efficient, some schools may prefer to invest in a commercially available tool that comes with pre-made forms and instructions. The two examples in these slides are simply that—examples.
Test of Silent Word Reading Fluency
This assessment is recommended as a convenient (i.e., group-administered) and accurate measure to screen students at risk. The publisher notes: “The TOSWRF accurately identifies students who are struggling with reading. It can also be used for monitoring reading progress and as a research tool. Because the test can be administered easily and quickly in a group format, it is an efficient and cost-effective screening method. The TOSWRF is not intended to be the sole measure for making eligibility or placement decisions; rather, it is best used as an initial screening measure to identify poor readers. Once students with poor reading skills have been identified, a more detailed diagnostic assessment can help determine the factors contributing to reading difficulties and the goals for intervention.”
Test of Silent Contextual Reading Fluency
The publisher notes: “The Test of Silent Contextual Reading Fluency (TOSCRF) provides a quick and accurate method to assess reading ability in children ages 7-0 through 18-11 and features four equivalent forms. Passages, adapted from the Gray Oral Reading Test and Gray Silent Reading Test, become gradually more complex in their content, vocabulary, and grammar (embedded phrases, sequenced adjectives, affixes, etc.). Each passage is presented as rows of contextually related words printed in uppercase without any spaces or punctuation (e.g., AYELLOWBIRDWITHBLUEWINGS). For each passage, students draw lines to separate as many words as they can in 3 minutes (e.g., A/YELLOW/BIRD/WITH/BLUE/WINGS). To do well on the test, the student has to read the meaning of the text. The TOSCRF measures a student’s essential contextual reading abilities and reliably identifies students who are struggling with reading. It can also be used to periodically monitor reading progress.”
Attached to this Facilitator’s Guide is a reading task that approximates what students would be asked to do on the Test of Silent Contextual Reading Fluency. Actual examples from the tool are not available for public use. After participants do this assessment, the original paragraph may be read aloud. At the conclusion of the activity, participants may want to discuss the differences between this assessment and the CBM maze passages.
The original paragraph:
The task of improving adolescent outcomes on critical national or state assessment measures is extremely challenging…even daunting! If SIM is to represent a promising solution for schools, it must be implemented in a way that will bring about significant changes in student behavior. During the past few years SIM implementers and CRL researchers have identified three factors that have been present when significant student gains have been made.
(Deshler, 2004).
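One way to think about scoring a slash-test response like the one above is to count the slashes that land on true word boundaries of the unspaced text. This is a sketch of that idea only; the TOSCRF publisher defines its own scoring rules, and all function and variable names here are illustrative.

```python
# Sketch of slash-test scoring: count student slashes that fall on true
# word boundaries of the unspaced passage. Illustrative only; not the
# publisher's actual TOSCRF scoring procedure.
def boundary_positions(words):
    positions, i = set(), 0
    for word in words[:-1]:
        i += len(word)
        positions.add(i)  # index in the joined string where the next word starts
    return positions

def slash_score(words, student_slashes):
    return len(boundary_positions(words) & set(student_slashes))

words = ["A", "YELLOW", "BIRD", "WITH", "BLUE", "WINGS"]
# Correct slashes for A/YELLOW/BIRD/WITH/BLUE/WINGS fall at positions 1, 7, 11, 15, 19.
perfect = slash_score(words, {1, 7, 11, 15, 19})  # 5
partial = slash_score(words, {1, 7, 10})          # 2 (the slash at 10 splits a word)
```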
This activity asks participants to keep in mind that screening assessments do not take the place of individualized diagnostic information that is required to guide instructional interventions.
1. Ask participants to look at this chart and talk about what parts of reading were assessed on the CBM sample screenings (and on “The Slash Test” if that was distributed and discussed). The image on the slide is also in the Participant’s Resource Packet.
2. Ask participants to talk about what kinds of data are important to guide instruction, but are NOT provided by screening.
Further, targeted screening of students at risk should be done to guide instruction:
Decoding single words, sight words, reading rate
Background knowledge, vocabulary, syntax
Understanding text structure, self-monitoring, making inferences
Targeted screening does not provide information about what is causing the difficulties, but it is important because it helps to focus instruction on the particular areas of need. A student who is struggling at the single-word decoding level, for example, needs either focused phonological processing instruction or structured, sequential, and individualized phonics instruction. Further diagnostic assessment will indicate these areas of need.
The slides in this section of the session emphasize the importance of gathering information about students’ area(s) of need in order to inform instruction.
While further assessment does take time, it ensures that instructional time is not wasted with students but is targeted specifically to their areas of need.
Ask participants to form three groups.
They should review the sample scored CBM of oral reading fluency, and use the handout to categorize the oral reading errors. This categorization can assist in guiding reading instruction (not exclusively, but to focus on particular areas of fluency difficulty).
The recommended reading materials for participants include several analyses of literacy assessments that focus on reading. These slides provide examples of other formal assessments that may be used to guide instruction. The commercially available assessments included in these slides are examples of instruments that assess various areas that contribute to literacy development, and are useful in a diagnostic assessment process.
In addition to screening and gathering diagnostic information about students’ cognitive abilities and skills, we must not forget the other domains that profoundly influence academic performance: the affective and the experiential.
Most participants will readily acknowledge that student engagement and motivation are essential to making progress, and that students’ academic lives are shaped as much by the experiential domain as by the cognitive.
Adolescents’ motivation to read and reading self-concept can be assessed informally using the forms in the following article: Pitcher, S., Albright, L., DeLaney, C., Walker, N., Seunarinesingh, K., Mogge, S., et al. (2007). Assessing adolescents’ motivation to read. Journal of Adolescent & Adult Literacy, 50(5), 378-396. http://www.education.txstate.edu/ci/people/faculty/Delaney/contentParagraph/04/document/Delaney+3.pdf
Rick Lavoie, in his 2007 book The Motivation Breakthrough: 6 Secrets to Turning on the Tuned-Out Child, includes many inventories that can be used with students to assess their overall motivational styles. If there is interest in student motivation, Lavoie’s DVD of the same title provides much fodder for teacher discussion.
Hildebrandt’s 2001 article, “But There’s Nothing Good to Read,” addresses librarians but is informative in terms of adolescent reading interests.
Assessment of the experiential domain for adolescents involves a variety of factors, some of which overlap with the Motivation to Read Profile mentioned above and included in the handout.
In addition, as part of an overall approach to improving academic performance, assessments of learning styles and thinking styles can provide a tremendous amount of information that can help guide instruction and give students the vocabulary they need to advocate for themselves in any learning situation. There are numerous websites that purport to offer inventories of learning and thinking styles, but they either charge a fee or do not reflect assessments that have been developed, piloted, and published by educational and psychological researchers. The most comprehensive cost-free online assessment of learning and thinking styles (about 104 questions), which requires only an email address to which results can be sent, is at the Learning Disabilities Resource Community site: http://www.ldrc.ca/projects/tscale/
This activity is subject to time availability in the session. If participants do not do the activity, they should receive the handout to read through in their own time, as it is a valuable process for gaining insight about their students.
Distribute the Motivation to Read Handout.
Ask participants to turn to pp. 381-2.
As the facilitator, read the instructions on p. 387, and tell participants to follow your instructions.
Ask participants to read the scoring instructions on pp. 389-90.
If there is time available, ask participants to engage in the Conversational Interview.
Read over the assessor instructions for the conversational interview on p. 388, then look over the interview questions on pp. 383-6.
Ask participants to pair up and choose who will be the assessor and who will play the student. Use part or all of the Conversational Interview.
AFTER THE SESSION (FOR NEXT TIME…)
Read Johnson, E., Mellard, D. F., Fuchs, D., & McKnight, M. A. (2006). Responsiveness to Intervention (RTI): How to do it. Lawrence, KS: National Research Center on Learning Disabilities. http://nrcld.org/rti_manual/index.html Section 2, Progress Monitoring (focus on pp. 2.1-2.9).
Read Fuchs, L. S., & Fuchs, D. Progress monitoring in the context of responsiveness-to-intervention. Portsmouth, NH: RMC Research Corporation, Center on Instruction. Retrieved 3/18/10 from http://www.centeroninstruction.org/files/plugin-UsingCBMRTI_manual.pdf pp. 4-9.
Read Deno, S. (2003). Developments in curriculum-based measurement. Journal of Special Education, 37(3), 184-192. Retrieved from http://www.studentprogress.org/weblibrary.asp#data
Review Torgeson, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research Corporation, Center on Instruction. “Challenges to the Successful Use of Assessment for Instruction with Adolescents,” pp. 57-66, and “Part II: Ten Examples of Assessments, or Assessment Systems, in Current Use or Under Development to Guide Instruction in Adolescent Literacy,” pp. 75-104.
Consider Taking the online thinking styles inventory at http://www.ldrc.ca/projects/tscale/
Determine One aspect of your school assessment plan that needs work. Ask each participant to get information about one of the assessments mentioned and be ready to share out to the group about what the assessment tests, for what age group it would be appropriate, how long it takes, how much training is required, and the cost.
Create, administer, and score a CBM oral reading assessment and a maze assessment with a group of students, and come to the next session prepared to discuss what you learned.
REFERENCES
Afflerbach, P. (2008). “Meaningful assessment of struggling adolescent readers.”
Pp. 249-264. In Lenski, S. and Lewis, J., eds. (2008). Reading success for
struggling adolescent learners. New York, NY: Guilford Press.
Davidson, M. (2008). The impact of response to intervention on secondary literacy. In 2008
Denti, L. and Geurin, G. eds. Effective practice for adolescents with reading and
literacy challenges. New York, NY: Routlege.
Deshler, D. (2004). We’ve been waiting for this moment—Are we ready? http://www.ldonline.org/article/We%27ve_Been_Waiting_For_This_M oment&%23133%3BAre_We_Ready%3F
Hildebrandt, D. (2001). But there’s nothing good to read. Media Spectrum: The
Journal for Library Media Specialists. http://ww2.mimame.org/member%20resources/Spectrum/f01hildebra ndt.pdf
Lavoie, R. (2007). The motivation breakthrough: 6 secrets to turning on the
tuned-out child. Touchstone.
Pitcher, S., Albright, L., DeLaney, C., Walker, N., Seunarinesingh, K., Mogge, S., et al. (2007). Assessing adolescents' motivation to read. Journal of
Adolescent & Adult Literacy, 50(5), 378-396. http://www.education.txstate.edu/ci/people/faculty/Delaney/contentP aragraph/04/document/Delaney+3.pdf
Shefelbine, J. Framework for Reading. Image captured coe.sfsu.edu/crlp/els.php
Torgesen, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research Corporation, Center on
Instruction. “The Role of Screening and Diagnostic Assessments in a
Comprehensive Assessment Plan,” pp. 48-56.
Adolescent Literacy Facilitator’s Guide
ADDITIONAL RESOURCES
McDonald, T., Thornley, C., Staley, R., & Moore, D. (2009). The San Diego
Striving Readers' Project: Building Academic Success for Adolescent
Readers. Journal of Adolescent & Adult Literacy, 52(8), 720-722.
Retrieved from Professional Development Collection database.
An introduction to curriculum-based measurement/curriculum-based assessment. http://www.specialconnections.ku.edu/cgi-bin/cgiwrap/specconn/main.php?cat=assessment&section=cbm/main
Schoolwide Screening http://ldaofky.org/RTI%20Documents.htm (Under RTI
Manual, Section 1).
Florida Department of Education Diagnostic Instruments http://www.justreadflorida.com/educators/PrimSecDiagChart.asp
ELL REFERENCES
Lesaux, N., Koda, K., Siegel, L., & Shanahan, T. (2006). Development of literacy. In D. August & T. Shanahan (Eds.), Developing literacy in second-language learners: A report of the National Literacy Panel on Language-Minority Children and Youth (pp. 53-74). Mahwah, NJ: Lawrence Erlbaum Associates.
Assessment Vocabulary Probe
Name:_____________________ Date:________________
Directions: Match the definition with the correct assessment term
1) ___ holistic scoring
2) ___ progress-based
3) ___ norm-referenced
4) ___ quantitative data
5) ___ qualitative data
6) ___ criterion-referenced
7) ___ validity
8) ___ achievement-based
9) ___ reliability
10) ___ grade

a. scores measure performance at the end of an instructional period; also called outcome assessment
b. score reflects overall performance rather than scoring or analyzing individual dimensions
c. scores indicate performance in relation to a set standard
d. scores measure performance during an instructional period
e. scores are determined by where they fall along a bell curve, with half the cohort falling above the midpoint and half below it
f. scores measure what they are intended to measure
g. scores are descriptive in nature
h. scores are numerical in nature
i. scores are stable across time and different examiners
j. quantitative evaluation of standard(s) achievement

Score ___ of 10
Assessment Vocabulary Probe
Name:_____________________ Date:________________
Directions: Match the definition with the correct assessment term

11) ___ interim
12) ___ summative
13) ___ authentic assessment
14) ___ embedded assessment
15) ___ performance-based
16) ___ formative assessment
17) ___ curriculum-based assessment
18) ___ diagnostic assessment
19) ___ curriculum-based measurement
20) ___ screening

a. assessment process designed to guide instructional decision making
b. assessments designed to measure achievement
c. standardized assessments administered at set times during the school year to predict achievement on state tests
d. short probes designed to identify students who may be at risk for poor performance
e. assessments in which evidence of student learning outcomes is obtained from assignments in particular courses in the curriculum
f. assessments designed to measure application of learning to a "real-life" task or the creation of something
g. assessment process in which the behavior that the learning is intended to produce is evaluated and discussed in order to improve learning
h. assessments designed to identify specific strengths and needs
i. assessments in which instructional materials are used to measure learning along with direct observation of student performance
j. a specific set of assessment procedures for monitoring student progress in basic skills

Score ___ of 10
Distribute to participants. Ask them to take a minute to draw slash lines (/) to divide the words. After the minute is over, read the passage aloud. Although this is not the actual TOSCRF assessment, it provides an idea of the assessment experience.
The task of improving adolescent outcomes on critical national or state assessment measures is extremely challenging even daunting. If the strategic instruction model is to represent a promising solution for schools it must be implemented in a way that will bring about significant changes in student behavior. During the past few years strategic instruction model implementers and center for research on learning researchers have identified three factors that have been present when significant gains have been made.
Note to Facilitators:
The three documents for this activity are located in the appendix. See: ORF Reading Passage; Maze Passage One; Maze Passage Two.
Page 128 Adolescent Literacy Facilitator’s Guide

Unit 3: Session 3
GUIDING QUESTION/OBJECTIVE
What role does assessment play in decision making related to improving adolescent literacy?
In what ways can educators use assessment to improve outcomes for struggling learners?
What is progress monitoring?
How is progress monitoring done?
How should the data be used?
To learn what progress monitoring is and why it is essential to ensuring that struggling students and students with learning disabilities gain literacy proficiency
To learn how to monitor students’ progress in reading fluency
INTRODUCTION
Building on Session 2, this session focuses on progress monitoring of students’ literacy skill development. The concept is introduced, and participants engage in a progress-monitoring activity that asks them to create a baseline and slope and to chart a student’s performance. While the setting, frequency, duration, and type of instruction to be provided to students are not addressed in this session, participants do discuss how progress monitoring indicates the need for changes in instruction.
ELL Connections:
For a helpful description of reading and writing proficiency for ELLs at different proficiency levels, see the third section of the DESE ELPBO document, “English Language Proficiency Level Descriptors,” available at: http://www.doe.mass.edu/ell/benchmark.pdf. This can give educators a sense of the stages that language learners go through as they develop literacy skills at different age levels. More detailed information about the developmental stages of English language acquisition can be found at: http://www.colorincolorado.org/article/26751 or http://www.ascd.org/publications/books/108052/chapters/The_Stages_of_Second_Language_Acquisition.aspx
BEFORE THE SESSION
Read Torgesen, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research
Corporation, Center on Instruction. “Challenges to the Successful Use of
Assessment for Instruction with Adolescents,” pp. 57-66 and “Part II: Ten
Examples of Assessments, or Assessment Systems, in Current Use or
Under Development to Guide Instruction in Adolescent Literacy,” pp. 75-
104.
Read Deno, S. (2003). Developments in Curriculum-Based Measurement.
Journal of Special Education, 37(3), 184-192. Retrieved from http://www.studentprogress.org/weblibrary.asp#data
Read Johnson, E., Mellard, D. F., Fuchs, D., & McKnight, M. A. (2006).
Responsiveness to Intervention (RTI): How to do it. Lawrence, KS: National
Research Center on Learning Disabilities. Section 2, Progress Monitoring and Section 4, Fidelity of Implementation in http://nrcld.org/rti_manual/index.html
Read Florida Center for Reading Research (2009). Progress monitoring for middle and high school students. http://www.fcrr.org/assessmentMiddleHighSchool.shtm
Review Jenkins, J. R., Hudson, R. F., and Lee, S. H. (2007). Using CBM-
Reading Assessments to Monitor Progress. RTI Action Network. http://www.rtinetwork.org/Essential/Assessment/Progress/ar/UsingCBM/1
Print Session 3 Participant’s Resource Packet.
Participants
Read Johnson, E., Mellard, D. F., Fuchs, D., & McKnight, M. A. (2006).
Responsiveness to Intervention (RTI): How to do it. Lawrence, KS: National Research
Center on Learning Disabilities. http://nrcld.org/rti_manual/index.html
Section 2,
Progress Monitoring (focus on pp. 2.1-2.9)
Read Resources: Curriculum-Based Measurement at the Secondary-School Level http://www.progressmonitoring.org/RIPMProducts2.html#cbm_secondary
Read Deno, S. (2003). Developments in Curriculum-Based Measurement. Journal of
Special Education, 37(3), 184-192. Retrieved from http://www.studentprogress.org/weblibrary.asp#data
Review Torgesen, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction. Portsmouth, NH: RMC Research Corporation, Center on
Instruction. “Challenges to the Successful Use of Assessment for Instruction with
Adolescents,” pp. 57-66 and “Part II: Ten Examples of Assessments, or Assessment
Systems, in Current Use or Under Development to Guide Instruction in Adolescent
Literacy,” pp. 75-104.
DURING THE SESSION
Ask participants to share their learning from last session’s activity suggestions (taking the online thinking styles survey; administering CBMs to students), and to share the information they gathered about aspects of the school’s assessment plan and about specific assessments.
Slide titles: …ON TEACHING; PROGRESS MONITORING DEFINITION; PROGRESS MONITORING AND SCREENING ARE DIFFERENT; PROGRESS MONITORING IN TIERED INSTRUCTION; BASIC STEPS TO PROGRESS MONITORING; WHAT IS EFFECTIVE PROGRESS?
Most of the content in these slides reflects the reading “Progress
Monitoring.” Johnson, E., Mellard, D. F., Fuchs, D., & McKnight, M. A.
(2006). Responsiveness to Intervention (RTI): How to do it. Lawrence, KS:
National Research Center on Learning Disabilities. http://nrcld.org/rti_manual/index.html
The facilitator and participants should use the Participant’s Resource Packet to complete this activity. It is recommended that participants work with a partner or in small groups, and that the facilitator circulate to answer questions.
Note to Facilitators:
It is important to explain to participants that most resources on progress monitoring and its “how-tos” use elementary school examples; however, the actual process of obtaining baseline performance, setting growth goals, charting progress, and making changes in instruction is the same regardless of the student’s age. The difference lies in the content/skill level of the probe itself.
Participants should also be made aware that this process is applicable not only to ORF and Maze measures, but also to measurements of writing fluency, math fluency, and vocabulary.
Note to Facilitators:
It is difficult to find useful blank PM charts on the internet. Many educators find it easier to create their own templates, either on the computer or using graph paper. The blank chart included in the guide comes from interventioncentral.org and provides a useful base for the activity. However, it only goes up to 140 WPM, which is not high enough to set an aimline for an upper-level high school student.
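Because the same baseline, goal-setting, and trend steps apply at every grade level, facilitators may find a small computational sketch useful for demonstration. This is a hypothetical illustration, not part of the guide's materials: the median-of-three baseline and the ordinary least-squares trend shown here are common choices, the latter being one simple alternative among the line-fitting methods compared by Hutton, Dubes, and Muir (1992).

```python
from statistics import median

def baseline_score(probe_scores):
    # Median of the first three probes is a common baseline choice.
    return median(probe_scores[:3])

def aimline(baseline, weekly_growth, weeks):
    # Expected score at each week if the student grows at the target rate.
    return [baseline + weekly_growth * w for w in range(weeks + 1)]

def trend_slope(scores):
    # Ordinary least-squares slope of observed scores against week number;
    # compare this with the aimline's weekly growth rate.
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den
```

For example, with a baseline of 115 WPM and a target growth of 1.5 words per week, the aimline reaches 130 WPM after 10 weeks; if the fitted trend of the student's weekly probes falls well below that target rate, the data indicate a needed change in instruction.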
1. Complete the survey on pp. 2.18-2.19 of the reading, “Progress Monitoring.”
2. Gather in groups by department, school, or grade level to discuss what is in place and what the priorities for implementation are.
3. In those same groups, work together to complete the needs assessment on p. 2.20 of the reading, “Progress Monitoring.”

Teams of teachers may work in groups to create a progress monitoring plan using Activity 2.1 of the “Progress Monitoring” reading (pp. 2.15-2.17). A progress monitoring plan should include necessary resources for indicated changes in instruction.
AFTER THE SESSION (FOR NEXT TIME…)
Consider reconvening the group to provide further professional development.
Future sessions should aim at a dual goal—
Further educating participants on CBM and progress monitoring practices in the general curriculum, and supporting educators engaging in CBM and progress monitoring of students receiving interventions
Focused attention on analysis of progress related to specific instructional practices (e.g., What works and why? What doesn’t work and why?)
REFERENCES
Afflerbach, P. (2008). Meaningful assessment of struggling adolescent readers. In S. Lenski & J. Lewis (Eds.), Reading success for struggling adolescent learners (pp. 249-264). New York, NY: Guilford Press.
Denti, L., and Guerin, G., eds. (2008). Effective practice for adolescents with
reading and literacy challenges. New York, NY: Routledge.
Fuchs, L. S. & Fuchs, D. Progress monitoring in the context of responsiveness-to-
intervention. Retrieved from http://www.centeroninstruction.org/files/plugin-
UsingCBMRTI_manual.pdf
Hutton, J. B., Dubes, R., & Muir, S. (1992). Estimating trend progress in monitoring data: A comparison of simple line-fitting methods. School
Psychology Review, 21, 300–312.
Johnson, E., Mellard, D. F., Fuchs, D., & McKnight, M. A. (2006). Responsiveness
to Intervention (RTI): How to do it. Lawrence, KS: National Research
Center on Learning Disabilities. http://nrcld.org/rti_manual/index.html
National Center on Student Progress Monitoring. www.studentprogress.org/admin1.asp
Pennsylvania Department of Education, Bureau of Special Education,
Pennsylvania Training and Technical Assistance Network. How to Create
a Graph for Progress Monitoring. Retrieved from http://www.pattan.k12.pa.us/files/ProgMon/GraphPM.pdf
Rogers, D. C. Creating baselines and aimlines. Special Connections, University of Kansas. Retrieved from http://www.specialconnections.ku.edu/cgi-bin/cgiwrap/specconn/main.php?cat=assessment&section=main&subsection=cbm/baselines
ADDITIONAL RESOURCES
Tools for CBM (including probe creation using your selected text) http://www.interventioncentral.org
Research Institute on Progress Monitoring. Resources: Curriculum Based
Measurement at the Secondary School Level. http://www.progressmonitoring.org/RIPMProducts2.html#cbm_secondary
Sitlington, P. (2008). Students with Reading and Writing Challenges: Using
Informal Assessment to Assist in Planning for the Transition to Adult
Life. Reading & Writing Quarterly, 24(1), 77-100. doi:10.1080/10573560701753153.
ELL REFERENCES
Colorín Colorado (2007). Placing English Language Learners in a Program of
Instruction. http://www.colorincolorado.org/article/26751
Hill, J. D., & Björk, L. (2010). Classroom instruction that works with English language learners facilitator's guide. ASCD. http://www.ascd.org/publications/books/108052/chapters/The_Stages_of_Second_Language_Acquisition.aspx
SESSION 3 APPENDIX
Sample “Slash” Test, as noted on page 128. The following pages contain the
three documents for this activity:
ORF Reading Passage (8 pages)
Maze Passage One (4 pages)
Maze Passage Two (4 pages)
[The ORF Reading Passage, Maze Passage One, and Maze Passage Two appear here in the printed guide (pages 136-151).]
Module 3 Leadership Roles and Responsibilities
Leadership plays a pivotal role in a school’s ability to establish and maintain the infrastructure necessary to support increased student learning and literacy. In a report commissioned by The Wallace Foundation, Leithwood et al. (2004) found that, “Leadership is second only to classroom instruction among all school-related factors that contribute to what students learn in school” (p. 5). The impact of leadership stems not only from the energy and direction that a strong leader can provide, but also from the extensive influence that leaders have on numerous other factors that affect learning. Biancarosa and Snow (2004, p. 12), for example, list six infrastructure elements necessary to improve student literacy learning at the middle and high school levels: 1) extended time for literacy instruction; 2) professional development opportunities to improve understanding and inclusion of literacy instructional practices; 3) ongoing summative assessment of student progress and programs; 4) professional learning communities, or teacher teams, to focus on student work and instructional planning; 5) active, engaged leadership to understand and support the literacy improvement plan; and 6) a comprehensive and coordinated literacy program crafted to the needs of students and teachers. While only number 5 explicitly describes the type of leadership necessary, each of the other infrastructure components requires effective leadership to make certain they are put into place.
Another review of successful school-wide efforts to improve adolescent literacy identified five key elements to success across settings: leadership, assessment, professional development, highly effective teachers, and intervention. Again, although “leadership” was identified as a single component (NASSP, 2005), it is also central to ensuring the success of all the others. This makes sense, of course.
Teachers, literacy coaches, and other school personnel cannot, by themselves, set and reinforce a vision of literacy improvement as central to the mission of middle and high schools; establish a schedule that creates time for interventions, time to read and write, and time for collaborative teacher and student work; provide professional development opportunities that are of high quality and are aligned to school improvement goals; or make sure that appropriate literacy assessments are in place and that the data from these are used to inform instruction. Ensuring that these elements are both present and coordinated is the function and responsibility of school leaders.
Through a review of the research and practice literature, Irvin, Meltzer and
Dukes (2007) identified three primary focus areas for an adolescent literacy improvement initiative: student motivation, engagement and achievement;
integration of literacy learning (across the content areas and in terms of literacy interventions for struggling readers and writers); and sustaining literacy
improvement (through attention to school environment, culture and policies; working with parents and the community; and the reciprocal relationship between schools and districts relevant to improving literacy). This review also identified five key actions that leaders of successful school improvement efforts take: 1) develop and implement a data-driven literacy action plan, 2) support teachers as they improve instruction through a combination of supports and monitoring strategies, 3) use data to improve teaching and learning, 4) build leadership capacity among faculty, and 5) allocate resources.
Providing high-quality, ongoing content-area literacy instruction for all students and effective intervention support for struggling students requires a schoolwide effort and, therefore, the active involvement of many people. The practice literature confirms that when the responsibility for literacy improvement is in the hands of a single administrator or a literacy coach, the undertaking is likely to fail and is not sustainable when key personnel leave (Deshler, Deshler & Biancarosa, 2007). Literacy coaches have emerged as potential leaders in many middle and high schools in efforts to improve classroom instruction and student learning, and sometimes, when hired, are seen as “the solution to the literacy challenge” (McKenna & Walpole, 2010). However, a literacy coach cannot do this work alone. Even with active administrative support, a coach can only be a catalyst in helping everyone to better understand the issues, learn new approaches, and apply the approaches frequently within classes and widely across classrooms (Shanklin, 2007). Everyone has a role to play. The practice literature clearly indicates that a distributed leadership model appears to be most effective when putting a school-wide literacy improvement effort in place (Kral, 2007; Ippolito, 2009).
Many successful schools begin by creating a literacy leadership team made up of school personnel across a variety of roles. A team might include administrators, content teachers, resource teachers, literacy coaches, reading specialists, special education teachers, counselors, and media specialists. This literacy leadership team then takes responsibility for engaging in a five-stage literacy improvement process.
The process outlined below (Figure 1) is based on the key practices in the change and school reform literature as applied to improving adolescent literacy at the middle and high school levels, and can provide a model for how to initiate
and sustain a literacy improvement effort in secondary schools (Irvin, Meltzer, Dean & Mickler, 2010).
Figure 1. Five-Stage Literacy Improvement Process: Get Ready → Assess → Plan → Implement → Sustain (a repeating cycle).
Get Ready: In this step, school leaders establish the need for a focus on literacy improvement, develop a collective vision of literacy-rich teaching and learning, and assemble a representative literacy leadership team to organize and support the effort.
Assess: In this step, the literacy leadership team analyzes data relative to the strengths and challenges of the school’s capacity to support the literacy development of all students relative to the Taking Action Literacy Leadership Model (Irvin, Meltzer & Dukes, 2007), and develops measurable literacy improvement goals. Specific areas that address student needs and can be improved through collective, collaborative effort become the focus of the plan.
Plan: In this step, the literacy leadership team develops detailed implementation maps that describe the action steps and timeline for how each literacy improvement goal will be reached. The map should include specific plans for professional development, a strategy for supporting that professional development, and detail of how teachers will be held accountable for implementing changes in practice, policy, or approach. The team then communicates the plan to the faculty and publishes the plan to ensure transparency.
Implement: Using the plan as an implementation guide, the literacy leadership team supports and monitors plan implementation, troubleshooting where necessary, and collecting evidence of progress toward goals.
Sustain: At the end of the year, the literacy leadership team reviews the progress that has been made, updates the goals and the plan as necessary, decides how momentum will be sustained, and celebrates accomplishments. A plan for how to “get ready” to continue the work in the fall is developed. In this
way, the focus is maintained and literacy improvement remains a central focus into future years.
Each Module’s “Leadership Addendum” contains specific actions that the principal/assistant principal, literacy coach, and members of the literacy leadership team can take relative to the particular Module. In order for leaders to be well positioned to support the professional development associated with each Module, it is important for those leaders to be familiar with the content of the Modules. The potential impact of the Modules will be maximized if leaders support the work initially and throughout experimentation with new instructional practices. The four Modules are:
Module 1: Adolescent Reading, Writing, and Thinking
Module 2: Content-Area Literacy
Module 3: Assessment
Module 4: Tiered Instruction
We have also provided suggestions for actions that leaders can take to extend aspects of the material addressed in each Module so that important and courageous conversations and powerful practices can be supported. The
Modules emphasize the use of discussion protocols (McDonald, Mohr, Dichter,
& McDonald, 2007) as one way to encourage facilitators and participants to engage in collegial conversations about instructional practice. Leaders’ participation in some of these discussions, and their further use of protocols, can encourage new ways of communicating in schools.
School literacy leaders often experience many competing time constraints.
However, improving adolescent literacy can be an effective lever for improving student learning and achievement and is well worth the required focused attention. Literacy leaders need to understand that the issues directly related to adolescent literacy present complex system challenges such as how to address the needs of ELLs, how to improve student motivation to read and write in school, and how to increase the amount and quality of content-area writing. In the past 20 years, researchers have learned a lot about student literacy learning, multiple literacies, and teaching, but educators do not always know what researchers have discovered. The information presented in these Modules is a great place for leaders, coaches, and teachers to begin to increase their knowledge and understanding of these and other key literacy issues. For a
review of 16 issues that middle and high school educators identified as important to address and practical ideas for steps that leaders can take, see
Meeting the Challenge of Adolescent Literacy: Practical Ideas for Literacy
Leaders (Irvin, Meltzer, Mickler, Phillips, & Dean, 2009).
The Leadership Addendum for each Module provides literacy leaders with rich suggestions for beginning the discussion, planning, and implementing the key suggestions presented in the four MA ESE Adolescent Literacy Professional
Development Modules. Literacy leaders may use the Modules individually or collectively to address important issues related to content-area literacy, supportive instructional practices, data-informed decision making, and effective intervention practices. Suggestions for specific actions that leaders can take before, during, and after the professional development process also address each of the five steps of the literacy improvement process. Leaders can use these materials to support the learning in the Modules and better prepare teachers and educational leaders to work together to improve literacy.
REFERENCES:
Biancarosa, G., & Snow, C. E. (2004). Reading next—A vision for action and research in middle and high school literacy: A report from Carnegie
Corporation of New York. Washington, DC: Alliance for Excellent
Education. Retrieved from http://carnegie.org/fileadmin/Media/Publications/PDF/ReadingNext.pdf
Deshler, R., Deshler, D., & Biancarosa, G. (2007). School and district change to improve adolescent literacy. In Deshler et al. (Eds.), Informed choices for
struggling adolescent readers (pp. 92-110). Newark, DE: International
Reading Association.
Ippolito, J. (2009). Principals as partners with literacy coaches: Striking a balance between neglect and interference. Literacy Coaching Clearinghouse.
Retrieved from http://www.literacycoachingonline.org/briefs/Principals_as_Partners.pdf
Irvin, J. L., Meltzer, J., & Dukes, M. (2007). Taking action on adolescent literacy:
An implementation guide for school leaders. Alexandria, VA: Association for Supervision and Curriculum Development.
Irvin, J., Meltzer, J., Mickler, M. J., Phillips, M., & Dean, N. (2009). Meeting the
challenge of adolescent literacy: Practical ideas for literacy leaders.
Newark, DE: International Reading Association.
Irvin, J., Meltzer, J., Dean, N., & Mickler, M. J. (2010). Taking the lead on
adolescent literacy: Action steps for schoolwide success. Thousand Oaks,
CA: Corwin Press.
Kral, C. (2007). Principal support for literacy coaches. Literacy Coaching
Clearinghouse. Retrieved from http://www.literacycoachingonline.org/briefs/PrincipalSupportFinal3-
22-07.pdf
Leithwood, K., Louis, K. S., Anderson, S., & Wahlstrom, K. (2004). How leadership influences student learning. A report commissioned by The Wallace Foundation. Center for Applied Research and Educational Improvement
at the University of Minnesota. Retrieved from http://www.wallacefoundation.org
McDonald, J. P., Mohr, N., Dichter, A., & McDonald, E. C. (2007). The power of
protocols: An educator's guide to better practice (2nd ed.). New York:
Teachers College Press.
McKenna, M. C. & Walpole, S. (2010). Literacy coaching in the middle grades.
Adlit.org. Retrieved from http://www.adlit.org/article/36143
NASSP (2005). Creating a culture of literacy: A guide for middle and high school
principals. Reston, VA: National Association of Secondary School
Principals.
Shanklin, N. (2007). What supports do literacy coaches need from administrators in order to succeed? Literacy Coaching Clearinghouse.
Retrieved from http://www.literacycoachingonline.org/briefs/LCSupportsNSBrief.pdf
School leaders are well aware of the multiple sources of data available for making decisions. However, schools frequently lack a process for, and a focus on, using those data to make informed decisions about instruction, assessment, or curriculum. This often occurs because teachers and administrators have not been trained to dig beneath the surface to understand what the data are indicating about instructional practices or student learning. Darling-Hammond,
Ancess, and Falk (1995) perhaps best define what the focus on assessment should be with this statement: “It is the action around assessment—the discussion, meetings, revisions, arguments, and opportunities to continually create new directions for teaching, learning, curriculum, and assessment—that ultimately have consequences. The ‘things’ of assessment are essentially useful as dynamic supports for reflection and action, rather than as static products with value in and of themselves” (p. 18). Because school leaders have the capacity to create the time and structures that facilitate these actions around assessment, their leadership is essential to moving the data analysis process forward. Doing so will enable schools to get a clear picture of student strengths and weaknesses, identify teacher professional development and coaching needs, and understand the school’s capacity to support and sustain a schoolwide literacy initiative.
There are several key actions school leaders need to put into place:
Develop a comprehensive understanding of the assessments currently being used, and identify the specific information provided by those assessments.
Promote the use of multiple assessments to better understand literacy learning (Irvin et al, 2007).
Develop a collaborative, reflective professional culture where teachers meet to analyze data and to identify specific action steps based on formal and informal assessments.
Implement regular progress-monitoring sessions among the staff to discuss student progress as well as the progress of instructional change and suggested interventions.
Provide time within the schedule for teachers to analyze data and common assessments to determine if curricula and instructional practices are producing the desired outcome with student performance.
Page 158 Adolescent Literacy Facilitator’s Guide
Module 3 Leadership Roles and Responsibilities
Use available resources in the area of assessment to guide leadership and teachers in analyzing current data and making informed instructional and intervention decisions (NASSP, 2005; MA ESE District Data Team Toolkit, 2009; Boudett, City, & Murnane, 2007; Irvin et al., 2009).
Working with the School Literacy Leadership Team, school leaders should carefully create action steps toward developing a culture of data-driven decisions about instruction and intervention supports that students need in order to be successful.
REFERENCES
Boudett, K., City, E., & Murnane, R. (Eds.). (2007). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
Darling-Hammond, L., Ancess, J., & Falk, B. (1995). Authentic assessment in action: Studies of schools and students at work. New York: Teachers College Press.
Irvin, J. L., Meltzer, J., & Dukes, M. (2007). Taking action on adolescent literacy: An implementation guide for school leaders. Alexandria, VA: Association for Supervision and Curriculum Development.
Irvin, J., Meltzer, J., Mickler, M. J., Phillips, M., & Dean, N. (2009). Meeting the challenge of adolescent literacy: Practical ideas for literacy leaders. Newark, DE: International Reading Association.
Massachusetts Department of Elementary and Secondary Education. (2010). District data team toolkit (Version 1.0).
NASSP. (2005). Creating a culture of literacy: A guide for middle and high school principals. Reston, VA: National Association of Secondary School Principals.
As literacy leaders, you may find it useful to consult additional resources for ideas and suggestions to support successful implementation of the ideas suggested in this Module. Table 1 provides information about how to organize and support the use of assessment data. The resources can also be used as professional book studies for the literacy leadership team.
Table 1: Resources for Literacy Leaders
Irvin, J., Meltzer, J., Mickler, M. J., Phillips, M., & Dean, N. (2009). Meeting the challenge of adolescent literacy: Practical ideas for literacy leaders. Newark, DE: International Reading Association. Chapter 3, "What Can Literacy Leaders Do When Data Show That Most Students Do Not Read or Write on Grade Level?"
Description: This chapter provides literacy leaders with suggestions and responsive action steps to put into place when students are reading and writing below expectations.

Carnegie Council on Advancing Adolescent Literacy. (2010). Time to act: An agenda for advancing adolescent literacy for college and career success. New York, NY: Carnegie Corporation of New York. Chapter 3: Data Collection and Use, pp. 30-33.
Description: Information for literacy leaders who need a brief overview of types of assessments that can be used to inform program and instructional decisions.

MA ESE District Data Team Toolkit. http://www.doe.mass.edu/sda/ucd/ddtt/Introduction.doc
Description: This resource describes a process by which educators can collaboratively review and plan based on data.

Boudett, K., & Steele, J. (2007). Data wise in action: Stories of schools using data to improve teaching and learning. Cambridge, MA: Harvard Education Press.
Description: School profiles are provided to support literacy leaders with using the Data Wise Cycle: Prepare, Inquire, and Act.

NASSP. (2005). Creating a culture of literacy: A guide for middle and high school principals. Reston, VA: National Association of Secondary School Principals. Chapter 3: Putting Assessment in the Driver's Seat, pp. 19-29.
Description: Helpful suggestions, research, and a school profile provide literacy leaders with ideas of how to collect and use data effectively to support students and identify additional professional development support for teachers. A profile of highly successful J.E.B. Stuart High School in Falls Church, VA, shows an example of schoolwide data-driven decision making to improve literacy learning.

NASSP. (2006). Breaking ranks in the middle: Strategies for leading middle level reform. Reston, VA: National Association of Secondary School Principals. Chapter 4: Making Learning Personal: Curriculum, Instruction, and Assessment, pp. 175-240.
Description: Literacy leaders seeking information about how to effectively use data to personalize instruction are provided with concrete suggestions and examples in this chapter. Experts Carol Ann Tomlinson and Tom Rudin provide additional insight about ability grouping and preparing students for college. Examples of schools using data to make informed decisions are included, along with explicit recommendations for success.

NASSP. (2004). Breaking ranks II: Strategies for leading high school reform. Reston, VA: National Association of Secondary School Principals. Chapter 2: Sowing the Seeds for Change: Collaborative Leadership, Professional Learning Communities, and the Strategic Use of Data, pp. 19-66.
Description: Provides an overview of how leadership, PLCs, and data use can be used to make critical educational decisions to facilitate change in schools.
Table 2 provides a list of suggestions for how literacy leaders can support teachers in their efforts to learn and apply the information in Module 3.
Action Steps Persons Responsible
Literacy leaders can establish a professional reading agenda to develop an understanding of how to use the multiple sources of data that inform decisions about teacher support and student learning needs. The suggested resources in Table 1 can help them establish an organized reading calendar for the year. Some of the resources may be used by the Literacy Team to guide planning, but as a team, you may also identify resources that will build understanding among the staff at large.
Understanding data is essential to the successful implementation of instructional support that can improve student literacy learning. Using suggested professional reading resources to develop a common understanding of data use among the staff will be important to your efforts. Plan to use some of the protocols suggested in the Modules for reading and discussing these texts; for example,
Text Rendering Experience: http://www.schoolreforminitiative.org/protocol/doc/text_rendering.pdf
Three Levels of Text Protocol: http://www.schoolreforminitiative.org/protocol/doc/3_levels_text.pdf
Evaluate the type of data currently collected on instructional practices.
Does it provide enough information about instructional practices that support literacy learning? How do you use these data to better support teachers and, ultimately, student literacy learning?
Analyze all assessments given at the school and categorize each by type: screening, formative, diagnostic, progress monitoring, or summative. As a follow-up to this exercise, ask teachers to brainstorm as grade-level or departmental teams to determine what type of data they get from each assessment and how it helps them to teach content.
Use the Looking at Data Sets Protocol http://www.schoolreforminitiative.org/protocol/doc/looking_data_sets.pdf
to begin collaborative discussions about the type of data and information provided from current assessments.
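For teams that keep their assessment inventory in electronic form, the categorization exercise above can be sketched in a few lines of code. This is only an illustration of the sorting step; the assessment names below are hypothetical examples, not a recommended battery.

```python
from collections import defaultdict

# Hypothetical inventory: each entry is (assessment name, type).
# The five types mirror those named in the action step.
assessments = [
    ("Reading screener (fall)", "screening"),
    ("Weekly exit tickets", "formative"),
    ("Individual reading inventory", "diagnostic"),
    ("Biweekly fluency check", "progress monitoring"),
    ("End-of-unit exam", "summative"),
    ("State ELA test", "summative"),
]

# Group the inventory by assessment type so grade-level or
# departmental teams can see at a glance what each type covers.
by_type = defaultdict(list)
for name, kind in assessments:
    by_type[kind].append(name)

for kind in ("screening", "formative", "diagnostic",
             "progress monitoring", "summative"):
    print(f"{kind}: {', '.join(by_type[kind])}")
```

A grouped view like this also makes gaps visible immediately: an empty category signals a type of assessment the school is not yet collecting.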
Complete a full assessment audit to determine if there are assessments that provide adequate data about student literacy strengths and weaknesses. Are there assessments that are used for screening, formative, diagnostic, and progress monitoring?
Develop an assessment calendar that includes when results may be ready to share with the leadership team and the faculty.
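The calendar step amounts to pairing each administration date with an expected scoring turnaround. A minimal sketch, assuming illustrative dates and turnaround times that a school would replace with its own:

```python
from datetime import date, timedelta

# Hypothetical assessment calendar. Each entry records the assessment
# name, its administration date, and an assumed scoring turnaround in
# days; none of these values are prescribed by the Module.
calendar = [
    ("Fall reading screener", date(2011, 9, 15), 10),
    ("Winter benchmark", date(2012, 1, 20), 14),
    ("Spring benchmark", date(2012, 5, 10), 14),
]

# Compute when results should be ready to share with the
# leadership team and the faculty.
ready_dates = {
    name: administered + timedelta(days=turnaround)
    for name, administered, turnaround in calendar
}

for name, ready in ready_dates.items():
    print(f"{name}: results ready {ready.isoformat()}")
```

Publishing the "results ready" dates alongside the administration dates lets the leadership team schedule its data-analysis sessions in advance rather than after results arrive.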
Allocate resources to purchase additional assessments, if needed, and the professional development needed to administer and analyze the results of any new assessment.
Provide adequate time in the schedule for the leadership team and teaching teams to analyze assessment results and make decisions with explicit action steps for improvement.
Develop a cadre of ‘experts’ on staff who are willing to support PLCs, or teams of content teachers, to interpret data and make instructional decisions based on the data. Additional training may be required to develop the expertise, but in-house experts are normally well-received by colleagues.
Provide sessions and assistance throughout the school year to dig more deeply into data. Use protocols suggested in the MA ESE District Data
Team Toolkit http://www.doe.mass.edu/sda/ucd/ddtt/Introduction.doc
and MA ESE Module 3 to:
Develop a vision for using data to improve instruction and literacy learning.
Develop a complete data inventory of literacy assessments.
Develop a specific focus question connected with literacy improvement.
Analyze data using the Data Analysis Protocol to determine strengths and areas for focused improvement.
Determine the problem that may be preventing literacy-based student success using the Why, Why, Why protocol.
Using the logic model, design specific action steps for improving literacy instructional practices and student literacy learning.
Determine a plan for progress monitoring and make adjustments as needed to improve literacy instruction and student outcomes.
Provide sessions for teachers, using in-house data experts or outside experts, around reading and understanding the results of specific assessments and what they reveal about student literacy learning needs within their content areas. Build on the information about assessments introduced in Module 3, which included disciplinary assessments, formative assessments, screening assessments, and progress monitoring assessments.
Within PLCs, ask teachers to analyze lesson plans. Ask them to consider these guiding questions: 1) Are there opportunities for students to read, write, listen, discuss, and investigate? 2) Is the gradual release of responsibility model embedded within the lesson plan? 3) Is there evidence of literacy strategies that support the literacy demands of the content? 4) Do daily plans consistently provide for a variety of learning styles and thinking styles? It will be helpful to engage teachers in developing a rubric for analyzing lesson plans to use with this activity. Ask teachers to use their findings to provide feedback about additional support required to better design literacy-rich content lessons.
Provide teachers with additional support around developing informal classroom assessments that help them to better understand the specific literacy learning needs of the students. Additional support may be required for developing checklists and observation guides to identify students’ use of strategies, fluency with content text, skill with effectively using text structure, and vocabulary and comprehension skills. Use some of the material from Module 3 as background to begin discussions.
Suggestions include:
Chappuis, S., & Stiggins, R. (2009). Formative assessment and assessment for learning. In Pinkus (2009), Meaningful measurement.
Torgesen, J. K., & Miller, D. H. (2009). Assessments to guide adolescent literacy instruction (pp. 3-17). Portsmouth, NH: RMC Research Corporation, Center on Instruction.
Support teachers in collaborating to look at student work samples in order to identify the specific literacy demands of the assignment and evaluate student literacy skills used to complete the assignment. Does the assignment embed opportunities to address MA Literacy targets?
Support activities suggested in Module 3 to be used/shared among the whole staff, e.g. collegial observations in Unit 1: Session 1, use of the
Assessment Knowledge/Confidence Survey in Unit 1: Session 2, and the comparison between good/poor assignments in Unit 1: Session 2.
Collect multiple sources of data, including student formal and informal assessment data, teacher data surveys, information from PLC teams, and literacy walks. Develop multiple data displays for intensive discussion and analysis, using the protocols from the MA ESE District Data Team Toolkit: http://www.doe.mass.edu/sda/ucd/ddtt/Introduction.doc
Share data with teachers and begin the action planning process to address continued needs and focus for improvement for next school year.
Ensure that teachers providing support to struggling students get adequate professional development related to collection and use of diagnostic assessments and how to implement interventions with fidelity.