Assessment Handbook
2006–2007
PENNSYLVANIA DEPARTMENT OF EDUCATION
Bureau of Assessment and Accountability
Division of Assessment
2006–2007 Science Assessment Handbook
TABLE OF CONTENTS
Introduction
Part One: About the Science Assessment
  A. Overview of the PSSA
  B. Test Highlights
  C. Small Mistakes on Open-Ended Responses
Part Two: Linking the PSSA to Classroom Practice
  A. Anchors-in-Practice
  B. Adopt-an-Anchor
  C. What Successful Schools Are Doing
Part Three: Frequently Asked Questions
Introduction
In 2005, the Pennsylvania Department of Education (PDE) released a set of Assessment
Anchor Content Standards (Assessment Anchors) for Science. The Assessment Anchors
provide greater clarity about the assessment system and can be used by educators to help
prepare students for the Pennsylvania System of School Assessment (PSSA). In the
spring of 2007, Pennsylvania students will participate in a science assessment in the form
of a stand-alone field test. In the spring of 2008, students will be given the first
operational assessment. Teachers must understand that the Pennsylvania Academic
Standards should still be used to drive curriculum and instructional decisions throughout
the year. The purpose of this handbook is to provide teachers with the most up-to-date
information about the PSSA and help them to prepare students for the test.
This handbook has three sections:
Part One: About the Science Assessment provides a brief history of the PSSA and
describes what’s new this year along with specifics about the test blueprints and test
format.
Part Two: Linking the PSSA to Classroom Practice offers a set of strategies to help
teachers better prepare their students for the Science PSSA and align their curriculum
to the Assessment Anchors and Pennsylvania Academic Standards.
Part Three: Frequently Asked Questions answers common questions (FAQs) about the
PSSA and the Assessment Anchor Content Standards.
This handbook is one of many tools the Department of Education has developed to help
teachers better understand the assessment system. In addition to this handbook, you can
access the following on the PDE website at
http://www.pde.state.pa.us/a_and_t/site/default.asp?g=0&a_and_tNav=|630|&k12Nav=|1141|.
Item and Scoring Sampler: The Science item and scoring sampler for grade 4 includes
ten multiple-choice items and one 2-point open-ended item. The grade 8 sampler
includes nine multiple-choice items, one 2-point open-ended item, and one scenario with
four multiple-choice items. The grade 11 sampler includes nine multiple-choice items,
one 2-point open-ended item, and one scenario with four multiple-choice items and a
4-point open-ended item. All open-ended items in the sampler include item-specific scoring
guidelines and student work. They represent the types of items that may appear on the
2007 PSSA field test and the 2008 PSSA operational assessment. Teachers may use the
items in the classroom and for professional development purposes.
http://www.pde.state.pa.us/a_and_t/cwp/view.asp?a=108&Q=73314&a_and_tNav=|680|&a_and_tNav=|
Accommodations Guidelines: 2003/04 was the first year the Department released a
comprehensive accommodations guide for all students, including students with
Individualized Education Programs (IEPs), students with 504 service agreements, and
English Language Learners. The Accommodations Guidelines were revised this year.
Check the website at
http://www.pde.state.pa.us/a_and_t/cwp/view.asp?a=108&Q=45132&a_and_tNav=|678|&a_and_tNav=|
for the revised 2007 Accommodations Guidelines for Students with
IEPs, Students with 504 Plans, English Language Learners and All Students. This
document contains information about the accommodations and student eligibility.
Alignment Strategies: The Department provided teachers with two strategies to ensure
classroom practice is aligned to the Assessment Anchors: Anchors-in-Practice and
Adopt-an-Anchor. These strategies are described briefly in this handbook. For more
information on these strategies, contact your Intermediate Unit or refer to Get Ready, Get
Set, GO! Assessment Anchor Rollout PowerPoint Presentation at
http://www.pde.state.pa.us/a_and_t/cwp/view.asp?a=108&Q=103127&a_and_tNav=|6309|&a_and_tNav=|.
Part One: About the Science Assessment
A. Overview of the Pennsylvania System of School Assessment (PSSA)
Like most state assessment systems, the PSSA has evolved over time. The following is a
brief description of the adoption of standards in Pennsylvania and the evolution of the
PSSA since 1998.
Chapter 4
On October 21, 1998, the State Board of Education adopted final-form regulations for the
new Chapter 4 of the Pennsylvania School Code.1 Upon conclusion of the regulatory
review process, Chapter 4 was published in the January 16, 1999, Pennsylvania Bulletin
as final rulemaking, binding in all public schools in the Commonwealth.
The new Chapter 4 replaced the previously adopted Chapters 3 and 5, and provided a new
direction for the PSSA. Beginning with the 1998–99 assessment, the PSSA became
standards-based. Beginning with the February/March 1999 testing, the entire PSSA had
to be aligned with the Pennsylvania Academic Standards. Standards were adopted for
Mathematics and for Reading, Writing, Speaking, and Listening.
Act 16
In the year 2000, Act 16 (Senate Bill 652) was adopted, which redefined the PSSA as a
test developed and implemented by the Department of Education to determine only
academic achievement relating directly to objective academic standards in the areas of
Reading, Mathematics, and Science.
The State Board of Education adopted the Science and Technology Standards on July 12,
2001 and the Environment and Ecology Standards on January 5, 2002.
Academic Standards have been widely distributed and can be found on the Pennsylvania
Department of Education web site:
http://www.pde.state.pa.us/stateboard_ed/cwp/view.asp?a=3&Q=76716&stateboard_edNav=|5467|.
Purpose of the PSSA
As outlined in Chapter 4, the purposes of the statewide assessment component of the
PSSA are as follows:
• Provide students, parents, educators, and citizens with an understanding of
student and school performance.
• Determine the degree to which school programs enable students to attain
proficiency of academic standards.
1 The link for §4.51 of the Pennsylvania Code is: http://www.pacode.com/secure/data/022/chapter4/s4.51.html
• Provide results to school districts (including charter schools) and Area
Vocational Technical Schools (AVTSs) for consideration in the development of
strategic plans.
• Provide information to state policymakers, including the General Assembly and
the State Board, on how effective schools are in promoting and demonstrating
student proficiency of academic standards.
• Provide information to the general public on school performance.
• Provide results to school districts (including charter schools) and AVTSs based
on the aggregate performance of all students, for students with an IEP, and for
those without an IEP.
No Child Left Behind (NCLB)
Since the adoption of Chapter 4, federal legislation has required the State to make
adjustments in its assessment system. Chapter 4 called for the Statewide Assessment
System to include only grades 5, 8, and 11. On January 8, 2002, NCLB was signed into
law, reauthorizing the Elementary and Secondary Education Act and requiring all states
to develop assessments in grades 3–8 and at least one assessment in grades 9–12 in
Reading and Mathematics. The Department began assessing grade 3 in 2003, and by
2005–2006 Pennsylvania had assessments in Reading and Mathematics for grades 3–8
and 11 as required by NCLB. NCLB also requires all states to have a Science assessment
developed and operational by 2007–2008.
NCLB does not require a separate Writing test, but per Chapter 4, Pennsylvania includes
a statewide Writing test at three grade levels. In 2005, the Writing PSSA shifted from
grades 6, 9, and 11 to grades 5, 8, and 11. All Writing assessments will be administered
in February each year. For more information on the Writing test, see the Writing
Highlights in the Assessment section of the PDE website.
Introduction of the Assessment Anchor Content Standards
The next evolution of the PSSA came in the introduction of the Assessment Anchor
Content Standards. The Assessment Anchor Content Standards (referred to as the
Assessment Anchors) clarify the standards assessed on the PSSA, and are designed to
hold together, or “anchor,” both the PSSA and the curriculum/instructional practices in
schools. Educators can use the Assessment Anchors to prepare their students for the
PSSA. The Assessment Anchors better align curriculum, instruction, and assessment
practices throughout the state. Without this alignment, we cannot significantly improve
student achievement in the Commonwealth. The introduction of standards has been
critical to improving student achievement in Pennsylvania. Over the last few years,
however, teachers have expressed a need for a clearer document.
The Department of Education identified the Assessment Anchors based on the
recommendations of teachers serving on the Science Assessment Anchor Committee and
other curriculum experts. The Department also looked to national organizations (i.e.,
NSTA, NCIEA, and NAEP) and other external groups to ensure that the Anchors are:
• Clear: The Department wanted to clarify which standards are assessed on the
PSSA. The Anchors should be easy to read and user friendly.
• Focused: Rather than have teachers guess which standards are most critical,
the Anchors identify a core set of standards that could reasonably be assessed
on a large-scale assessment.
• Aligned: The focus is on helping students achieve the state's standards. The
Anchors align directly to the state's standards in Reading, Mathematics, and
Science and give focus and clarification to the standards.
• Grade Appropriate: Teachers may have different ideas about what skills
should be mastered by which grade levels. The Anchors provide examples of
skills and knowledge that should be learned at the different grade levels and
be assessed on state assessments.
• Rigorous: The Department has maintained the rigor of the state standards
through the Anchors. In addition, the State will continue to use open-ended
items on the PSSA to assess higher-order reasoning and problem-solving
skills.
• Manageable: The Department identified a set of standards that could be
taught in a manageable way before the spring administration of the PSSA.
Like the standards, the Anchors will be reviewed periodically to ensure they represent the
most important skills and knowledge that should be assessed on the PSSA. For a copy of
the Assessment Anchors in Science, see the PDE website at:
http://www.pde.state.pa.us/a_and_t/cwp/view.asp?a=108&q=103127&a_and_tNav=|6309|&a_and_tNav=|.
This is the third year that the state is using the Assessment Anchors as a basis for the
PSSA in Mathematics and Reading. The Department has received overwhelmingly
positive feedback on the usefulness of the Assessment Anchors, which have helped
create a more focused assessment system that is aligned to the standards and tightly
tied to curriculum and instruction.
The Assessment Anchor Content Standards were critical in designing the grades 4, 8, and
11 assessments in Science because of the two-document set of standards for Science. The
Assessment Anchors clarify what students are expected to know at the end of grades 4, 8,
and 11 for the assessment. Teachers can see how concepts build from grade span to grade
span.
How to Read the Assessment Anchors
All of the Science Assessment Anchors begin with an “S” to indicate science. The
number after the “S” in the label is the grade level (e.g., S8 would be Science at eighth
grade). The second letter in the labeling system is the Reporting Category (A through D)
followed by the sub-reporting category number. The same reporting categories continue
across grades 4, 8, and 11. The next number in the label is the actual Assessment Anchor
number (e.g., 1.1, 1.2, 1.3, etc.). Essentially, you read the Assessment Anchors like an
outline. (See the example below.)
Example label: S8.D.1.1 (Science, Grade Level, Reporting Category, Sub-Reporting Category, Assessment Anchor)
For example, S8.D.1.1 is broken down as follows:
S = Science
8 = Grade 8
D = Earth and Space Sciences (Reporting Category)
1 = Earth Features and Processes that Change Earth and Its Resources (Sub-reporting
Category)
1 = Describe constructive and destructive natural processes that form different geologic
structures and resources. (Assessment Anchor)
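For illustration only, the labeling scheme described above is regular enough to be parsed
mechanically. The short Python sketch below is not a PDE tool; the pattern simply restates
the format explained in this section (subject letter, grade level, reporting category,
sub-reporting category, and anchor number).

    import re

    # Hypothetical helper: split a Science Assessment Anchor label such as
    # "S8.D.1.1" into the parts described in this handbook.
    ANCHOR_PATTERN = re.compile(
        r"^(?P<subject>S)"         # "S" = Science
        r"(?P<grade>4|8|11)"       # grade level: 4, 8, or 11
        r"\.(?P<category>[A-D])"   # reporting category A through D
        r"\.(?P<subcategory>\d+)"  # sub-reporting category number
        r"\.(?P<anchor>\d+)$"      # assessment anchor number
    )

    def parse_anchor(label: str) -> dict:
        """Return the parts of an anchor label, e.g. S8.D.1.1."""
        match = ANCHOR_PATTERN.match(label.strip())
        if match is None:
            raise ValueError(f"Not a valid Science anchor label: {label!r}")
        return match.groupdict()

    print(parse_anchor("S8.D.1.1"))
    # {'subject': 'S', 'grade': '8', 'category': 'D', 'subcategory': '1', 'anchor': '1'}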
In addition to the Assessment Anchors, other changes included:
• Reporting Categories: The Anchors are organized into four Reporting
Categories. These categories are similar to the NAEP (National Assessment of
Educational Progress) Reporting Categories. PA Standard Statements were
examined, and most were placed in the appropriate Reporting Categories. The
following table illustrates the connections between the Standards and the four
Reporting Categories.
Reporting Categories and Connections to the Standards

A. The Nature of Science
   Science and Technology: 3.1 Unifying Themes of Science; 3.2 Inquiry and Design; 3.6 Technology Education; 3.7 Technological Devices; 3.8 Science, Technology, and Human Endeavors
   Ecology and the Environment: 4.4 Agriculture and Society; 4.6 Ecosystems and their Interactions; 4.7 Threatened, Endangered, and Extinct Species; 4.8 Humans and the Environment

B. Biological Sciences
   Science and Technology: 3.1 Unifying Themes of Science; 3.3 Biological Sciences
   Ecology and the Environment: 4.2 Renewable and Nonrenewable Resources; 4.3 Environmental Health; 4.4 Agriculture and Society; 4.6 Ecosystems and their Interactions; 4.7 Threatened, Endangered, and Extinct Species

C. Physical Sciences
   Science and Technology: 3.2 Inquiry and Design; 3.4 Physical Science, Chemistry, and Physics

D. Earth and Space Sciences
   Science and Technology: 3.2 Inquiry and Design; 3.4 Physical Science, Chemistry, and Physics; 3.5 Earth Sciences; 3.7 Technological Devices
   Ecology and the Environment: 4.1 Watersheds and Wetlands; 4.2 Renewable and Nonrenewable Resources; 4.8 Humans and the Environment
• Item-Specific Scoring Guidelines: The Science items are scored with item-specific
scoring guidelines. The stand-alone open-ended items will be written to a 0-2 scale.
The scenario open-ended items are written to a 0-4 scale. Keep in mind that the
scoring guidelines do not equate with the four performance levels (Advanced,
Proficient, Basic, and Below Basic). These performance levels describe a student's
overall performance and should not be confused with the point scale on the scoring
guideline for open-ended items.
• Open-Ended Items: The Science stand-alone open-ended items are about 5-10
minutes in length and are written so there is more than one approach to correctly
answering the item. The scenario open-ended items are about 15-20 minutes in
length.
• Alignment to the Standards: All of the Science Standards categories are still
included on the PSSA. The Assessment Anchors tightened the focus of what is
assessed and at which grade level. Each Assessment Anchor includes references
to the Pennsylvania Academic Standard(s) it addresses.
• Rigor: The Assessment Anchors have maintained the rigor of the PSSA. The
Anchors have simply clarified and focused the assessment. During the item
review process, educator committees evaluate items using Webb's Depth of
Knowledge, as well as identify the item difficulty ("easy," "medium," or "hard") for
the appropriate grade level assessed. For more information on Depth of
Knowledge, see http://www.wcer.wisc.edu/WAT/index.aspx.
• Educator Committees: Educator committees continue to play a critical role in
the development of the PSSA through a variety of meetings. Educator committees
review all items and ensure rigor, alignment, and grade appropriateness.
Educators also participate in bias and sensitivity reviews.
B. Test Highlights
This section describes some basic information about the PSSA including the testing
window, length of the tests, test format (e.g., common versus matrix items), and types of
questions that will appear on the test.
Test Blueprint
The test blueprint gives information on the standards (and now Assessment Anchor
Content Standard Reporting Categories) measured on the PSSA, including relative
weights and the number of points assigned to each Anchor category. Analyzing the
blueprint can give insight into how to prepare students to meet the weighted expectations
set by the PDE and the PSSA Advisory Committees. The 2008 test blueprint provides
information on both common and matrix items.
2008 PSSA
Grades 4, 8, and 11 Science Assessment Blueprint*

Reporting Category          Grade 4    Grade 8    Grade 11
The Nature of Science          33         33         36
Biological Sciences            11         11         12
Physical Sciences              11         11         12
Earth and Space Sciences       11         11         12
Total Points                 66 pts.    66 pts.    72 pts.

*At each grade level, the table shows the number of points to be devoted to each reporting
category. The exact percentage for a reporting category will depend upon where the open-ended
items are placed. At grades 4 and 8, ten of the total points will come from 5 open-ended items,
each scored from 0 to 2 points. At Grade 11, 24 of the total points will come from 6 open-ended
items, each scored from 0 to 2, and 3 open-ended items, each scored from 0 to 4.
2007 Testing Windows
In the past, the Reading and Mathematics PSSA were given in grades 3, 5, 8, and 11. This
year, the assessment will again be given in those grades; however, the PDE will also
administer the PSSA for grades 4, 6, and 7 in Mathematics and Reading. The Science
field test will also be given in 2007 and will be administered to grades 4, 8, and 11. The
following is the calendar for the 2007 testing windows:
• February 12-23, 2007 — Writing Assessment, Grades 5, 8, and 11: operational test with an
embedded field test.
• March 12-23, 2007 — Reading Assessment, Grades 3–8 and 11, and Mathematics Assessment,
Grades 3–8 and 11: operational tests with an embedded field test; all grades will be used to
determine if schools meet AYP for NCLB; all students/schools participate.
• April 23-May 4, 2007 — Science Assessment, Grades 4, 8, and 11: stand-alone field test.
The PSSA dates for the spring administration are selected in an effort to avoid conflict
with as many spring breaks as possible.
2008 Approximate Length of Testing Time
The PSSA measures students’ ability to meet the Assessment Anchors regardless of how
much time it takes. Thus, the PSSA is not a timed test; every student receives extra time
if needed. However, there are certain conditions that test administrators must observe to
give students extra time. These guidelines are described in the assessment administration
manuals and the current version of the Accommodations Guidelines. The most important
condition is that any extra time must come immediately after the testing period and not
after lunch and/or the next day. The only exceptions on spacing the test sections are for
students with accommodations written into their IEPs.
The estimated time to administer each Operational Science test is as follows:
• Grade 4: 2 sections, approximately 95-100 minutes
• Grade 8: 2 sections, approximately 105-110 minutes
• Grade 11: 3 sections, approximately 155-160 minutes
2008 Common Versus Matrix Items
There are several different forms of the test, so students may not all be taking the same
version. Each form has both common and matrix items. The common items appear on
each form for every student to complete. On all PSSA assessments, only common items
are used to determine student scores and proficiency levels.
Matrix items vary from form to form. Matrix items are used to give additional
information to the school about student performance. Because Pennsylvania wants
students to do their best on all items, the students are not able to distinguish which items
are common, field test, or matrix on the test. Please keep in mind there will be an
embedded field test for all grades. Also, keep in mind that students must answer at least
five items in every section to receive a score.
2008 Types of Questions
The Science PSSA has both multiple-choice and open-ended items, with scenarios at grade 8 and
grade 11. A scenario is a content-rich passage, supported by graphics (e.g., graphs, charts, tables,
diagrams, illustrations), that requires students to use both content knowledge and science process
knowledge to answer complex problems. The scenario items are specifically aligned to eligible
content that Pennsylvania educators developed. When answering the items associated with
scenario stimuli, students are required to use their content knowledge and science process
knowledge. Some of the eligible content requires students to pull pertinent information from the
scenario; other content requires students to use prompts from the scenario to answer content-based
questions. Because responding appropriately to scenario questions requires content knowledge,
students will need more than good reading comprehension skills to answer these correctly. In
grade 8 there will be one scenario that will be answered with multiple-choice items. In grade 11
there will be three scenarios that will be answered with multiple-choice and 4-point open-ended
items. Students need to know how to respond well to all types of items, including scenarios, to
score at the proficient level on the PSSA. The following table displays the Science test design,
including the number and types of questions:
2008 PSSA Science Test Format

Grade 4 (2 sections, approximately 95-100 minutes)
   Multiple-choice items per student: 56 common, 0 common scenario, 3 matrix, 3 field test
   Open-ended items per student: 5 common (2 points each), 0 common scenario (4 points each), 1 matrix, 1 field test
   Score points per student (common items only): 56 + 10 = 66

Grade 8 (2 sections, approximately 105-110 minutes)
   Multiple-choice items per student: 52 common, 4 common scenario, 3 matrix, 7 field test
   Open-ended items per student: 5 common (2 points each), 0 common scenario (4 points each), 1 matrix, 1 field test
   Score points per student (common items only): 56 + 10 = 66

Grade 11 (3 sections, approximately 155-160 minutes)
   Multiple-choice items per student: 36 common, 12 common scenario, 5 matrix, 8 field test
   Open-ended items per student: 6 common (2 points each), 3 common scenario (4 points each), 0 matrix, 2 field test
   Score points per student (common items only): 48 + 12 + 12 = 72
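The score-point totals in the table can be checked with simple arithmetic: each multiple-choice
item is worth one point, each stand-alone open-ended item two points, and each scenario
open-ended item four points. The Python sketch below is illustrative only, not an official PDE
calculation; it simply re-adds the common-item counts shown above.

    # Common-item counts per grade, taken from the table above:
    # (common MC, common scenario MC, 2-point OE, 4-point scenario OE)
    designs = {
        4:  (56, 0, 5, 0),
        8:  (52, 4, 5, 0),
        11: (36, 12, 6, 3),
    }

    for grade, (mc, scenario_mc, oe_2pt, oe_4pt) in designs.items():
        total = (mc + scenario_mc) * 1 + oe_2pt * 2 + oe_4pt * 4
        print(f"Grade {grade}: {total} score points")
    # Grade 4: 66 score points
    # Grade 8: 66 score points
    # Grade 11: 72 score points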
Multiple-choice questions are also termed “selected response” because students choose
their answers from among those provided. Such items are an efficient means of assessing
a broad range of Anchors. In the PSSA Science Assessment, all multiple-choice items
have only one correct response choice, and the student is awarded one point for choosing
it. For all multiple-choice items students are provided with four choices.
• The Use of Other Tools in the Assessment: When responding to PSSA Science
items, students are not permitted to use Science books, dictionaries, PDAs, or
reference materials of any kind. If such materials were allowed, students could
use them, for example, to provide a correct answer to certain questions by looking
up a definition (e.g., photosynthesis) and then applying it. Students may use
highlighter pens during the test sessions to highlight text. However, students must
use a No. 2 pencil to mark their responses to questions. Only highlighter pens may
be used; other types of felt-tip marking pens may not be used. A Department of
Education contractor conducted a study and made the following recommendations
if highlighters are used during test administration.
• Students should be instructed never to use a highlighter in the same place that
they use a pencil.
• Highlighting over pencil marks of any kind, including filled-in bubbles or
students' handwriting, may cause the pencil marks to blur or bleed, which
impacts machine scoring.
• Do not allow students to highlight barcodes, pre-slugged bubbles or any
other marks or printing around the edges of the scannable document. The
highlighters may cause the ink used for these purposes to blur and bleed.
• Use a highlighter from the following list. These highlighters have been
tested and found to cause minimal problems with scanning:
o Avery Hi-Liter
o Avery Hi-Liter, thin-tipped
o Bic Brite-Liner
o Sanford Major Accent
o Sanford Pocket Accent, thin-tipped
C. Small Mistakes on Open-Ended Responses
The following list shows some of the ways students lose points on open-ended responses.
Students should regularly check over their responses, both in the classroom and on the
PSSA, to make sure they have not made mistakes similar to the following:
Complete Responses
Scorers cannot infer understanding from student responses. All test questions must be
answered according to the directions. If part of a test question asks for a written
example of the reason something is done, then a scientific formula alone, for example,
will not meet the requirements of a complete response.
See the open-ended examples in the Item Samplers on the PDE web site below:
http://www.pde.state.pa.us/a_and_t/cwp/view.asp?a=108&Q=73314&a_and_tNav=|680|&a_and_tNav=
Part Two: Linking the PSSA to Classroom Practice
The PSSA is designed to improve instruction—focusing lessons on students’ learning
needs, deepening rigor, and modeling clear expectations. There are times when, because
of a school’s performance or a sub-population’s performance, an increased emphasis
should be placed on the blueprint of the PSSA, in order to meet the needs of all students.
However, the Department supports the use of multiple assessments and the development
of a curriculum that goes beyond just meeting the Assessment Anchors.
Two strategies were developed to help educators align their curriculum and instructional
practices to the Assessment Anchors: Anchors-in-Practice and Adopt-an-Anchor. This
section describes these strategies and summarizes their success in schools across the
Commonwealth. You can also contact your Intermediate Unit for more information or
further training on these strategies.
Knowing Your Students
The PSSA measures student progress and provides evidence of student learning. The
PDE also views the PSSA as an important tool for decision-making. The PSSA and
classroom assessments provide information about students’ strengths and weaknesses,
and allow teachers to monitor progress throughout the school year. The PSSA provides
the annual snapshot, and the classroom assessments provide more frequent samples of
student progress. Student data helps teachers and administrators:
• align instruction and curriculum to student needs
• make appropriate instructional choices
• provide appropriate support
• select appropriate materials and programs
• focus on how students learn
Professional Development
Quality professional development helps teachers grow individually and as a team. It is
paramount in building a professional community focused on student learning. To meet
the challenge of educating all students, teachers should have the following:
• Ongoing, on-site professional development and opportunities to learn, share, and
discuss the work of teaching.
• Recognition of both collective and individual teachers' professional needs and
strengths.
• A thorough understanding of the Assessment Anchors, how they are translated
into practice, and also what grade-level proficiency looks like.
• Opportunities to plan, design, teach, evaluate, and score student work together.
The link between student achievement and effective teaching has been well documented.
Among the indicators (e.g., student work, district assessments), the PSSA provides
insight into the areas in which teaching and learning should be strengthened and
supported—across the district, in schools, in classrooms and for the individual student.
District and school leadership should use the Assessment Anchors and PSSA scores to
help guide them in choosing professional development opportunities.
A. Anchors-in-Practice
Adapted from the Education Trust’s Standards in Practice, Anchors-in-Practice focuses
on professional development and teacher training. The model is designed to elicit
discussion about the alignment among the Anchors, the PSSA Science Assessment, and
classroom practice. A more complete version is available on the Education Trust website
at www.edtrust.org.
Getting Started
To facilitate this exercise, teachers will need copies of the Assessment Anchors in
Science and one of the following: an assignment or problem (preferably a short one), a
PSSA released task, or a released task from another state or from NAEP. The exercise
should be facilitated with teams of teachers to prompt discussion and begin to align
individual expectations, understanding, and work.
Step One: Do the assignment or key parts of it.
Step Two: Analyze the assignment for its expectations. List the content and skills
the students must draw on or know to complete the assignment. Demands should
be embedded in the directions and the actual task.
Step Three: Match the expectations to the Anchor(s) that best align(s) with the
assignment. Look for the Anchor at the grade level that best describes the skills
and content inherent in the assignment. Try to prioritize the Anchors if there is
more than one. (If no Anchors match, or if the Anchor match is at a lower or higher
grade, discuss the implications and, if necessary, rewrite the assignment to be grade
appropriate and align with the grade-level Anchor.)
A fourth step can be added if time allows:
Step Four: Using your list of expectations and the language in the Anchor(s),
write a statement of “proficiency” that relates to what students must do and
demonstrate in their work to receive a passing grade.
Working with Anchors-In-Practice helps teachers realize how expectations differ and
how assignments are not always aligned to the most essential skills and knowledge
represented in the Assessment Anchors.
B. Adopt-an-Anchor
All students need to refine and practice their scientific skills. As curriculum becomes
more demanding and materials become more complex, application of scientific skills can
be difficult even for proficient students. Thus, all students need more time and experience
applying their scientific skills and making scientific decisions. Middle and high school
students who struggle with Science must be given ample opportunity to strengthen their
skills. This can happen if other departments and their staff "adopt" carefully matched
Science Anchors from the Pennsylvania Assessment Anchor document. The
Adopt-an-Anchor strategy seeks to achieve the following:
• Familiarize staff with the Assessment Anchors and standards.
• Deepen the instructional experiences in Science among various subject areas.
• Allow staff to share responsibility for teaching essential skills in Science.
Adopt-an-Anchor first identifies Anchors that are appropriate for each department and/or
course. Teachers then “adopt” one or more Anchors, accepting responsibility for teaching
all students the knowledge and skills related to the chosen Anchor(s). Thus, staff that
adopt anchors agree to strategically teach and assess student learning on these selected
anchors until students perform well on these anchors as measured by PSSA-like
assessment formats. While the instructional strategies may vary with the subject, and
assessment of proficiency may take many forms, staff who adopt Anchors agree to teach
students to be proficient on state assessments. The key to this strategy is selecting
Anchors that are appropriate for specific subjects and/or courses, and then accepting
responsibility for teaching students to be proficient in those Anchors as they will be
assessed on the PSSA.
The following examples illustrate the Adopt-an-Anchor strategy:
• The middle school Science department could adopt the Science Assessment
Anchor S8.A.1.1, which states that students should be able to explain, interpret,
and apply scientific, environmental, or technological knowledge presented in a
variety of formats. The Science department could then develop a Science task,
such as "Based on the following graph from 2000–2004, do you predict that
Anycity, USA, can expect a flu epidemic in the coming year? What might be
some influencing factors?"
• The elementary Art teacher could adopt the Science Assessment Anchor
S4.D.1.1: Describe basic landforms in Pennsylvania. The teacher could link this
to the Geography standard involving the physical characteristics of places and
regions. The teacher could then develop a Grade 4 art project that relates to both
statements by having students work, create, or draw the different landforms of
Pennsylvania.
C. What Successful Schools Are Doing
Some schools have found that the following strategies improve student achievement in
Science.
Elementary
• Develop appropriate scientific skills beginning in pre-kindergarten and kindergarten.
• Diagnose student instructional needs to provide appropriate instruction. This
includes a careful analysis of content areas, sub-populations, and problem-solving
skills.
• Use pedagogical methods common throughout the school so the students can
easily move from one classroom to another and apply the same skills. Science
and elementary teachers should carefully review the Science Assessment Anchor
Content Standards to make sure that what they are teaching is vertically
articulated. All other teachers should make sure to incorporate the science
content appropriate for their area as defined by the Science Assessment Anchors.
The teachers could employ the Adopt-an-Anchor or Anchors-in-Practice strategy
to accomplish this. The most recent TIMSS study dealt with pedagogy in addition
to student performance.
• Adopt a systemic approach, either through a common program or a textbook
series. The approach should be research-based. Project 2061 from AAAS, the
National Science Foundation, and the regional math centers have resources on
research-based information.
• Use lessons that focus on developing children's science vocabulary and engaging
students in developing a science vocabulary handbook. (Teachers of grades 4, 8,
and 11 can print out the Assessment Anchor glossaries found on the PDE
website. These glossaries include vocabulary words and the definitions that may
be used on the PSSA.)
• Use discussion-based techniques to clarify and model how to approach science
problems in a variety of ways, using a variety of strategies.
• Offer instruction on the proper use of the calculator as part of students' ongoing
development in the use of technology in relation to science.
• Focus professional development on the teaching of science.
• Monitor student progress using a variety of methods including those previously
mentioned as well as Diagnostic Classroom Assessment.
• Incorporate problem-solving and open-ended questions into the science
curriculum throughout the year.
• Have staff work with students to develop scoring guidelines for open-ended
problems so that they better understand the expectations.
• Have staff work with each other in developing scoring guidelines for open-ended
problems so that there is a common expectation among teachers within a school.
• Provide students with time to solve open-ended problems individually, before
sharing in small groups and discussing as a class.
• Have staff work with students to develop students' understanding of the
differences between the verbs used in open-ended items. Students should
understand the verbs "explain," "describe," and "identify."
Secondary
Many of the elementary suggestions also apply at the secondary level.
• Have all staff accept responsibility for teaching scientific skills and monitoring
progress.
• Address science in all subject areas within their content and curricula.
• Maximize the time available for teaching.
• Diagnose student instructional needs at the secondary level in order to provide
appropriate instruction. This includes a careful analysis of content areas, sub-populations, and problem-solving skills. The only difference between secondary
and elementary is that one should also examine the sub-population of vocational
students even though it is not included in NCLB.
• Use data and other diagnostic methods to shape curriculum.
• Use data and diagnostic methods to know the students, understand their scientific
proficiencies, and support and intervene where necessary.
• Provide professional development in the teaching of science for all staff so that
common strategies are used in core and elective classes.
• Offer professional development in how to make adaptations for special needs
students and other special categories, such as ESL students.
• Partner Science with Mathematics, Reading and Writing.
• Plan and articulate a science-rich curriculum. The PSSA should not be the only
source for guidance. The Trends in International Mathematics and Science Study
(TIMSS), Stanford Achievement Test (SAT), American College Test (ACT),
National Assessment of Educational Progress (NAEP), and other district-sponsored
standardized or locally developed tests can also be used.
Part Three: Frequently Asked Questions
The PSSA
Q: What is the Pennsylvania System of School Assessment (PSSA)?
A: The PSSA is a state assessment in Mathematics, Reading, Writing, and Science given
each year to Pennsylvania’s public school students to measure students’ achievement of
the Pennsylvania Assessment Anchors. Mathematics, Reading, and Writing are current
programs. PSSA for Science will have its first operational administration in the spring of
2008.
Q: Who creates/develops the assessment?
A: A number of different groups are involved in the development of the PSSA. The
questions on the assessment have been developed by testing contractors and are reviewed
and revised by Pennsylvania teachers. A Technical Advisory Committee, composed of
some of the leading assessment experts in the country, advises on the format of the
assessment.
Q: Who must take the assessment?
A: Mathematics and Reading Assessments are administered to all public school students
in grades 3–8 and 11; the Writing Assessment is administered to all public school
students in grades 5, 8, and 11. The Science Assessment will be administered in grades 4,
8, and 11. Private school students may participate if the private school volunteers to
participate. Home-schooled students may also volunteer to participate. Parents of
home-schooled children should contact their local districts to arrange for inclusion in the
testing.
Q: Can a parent or guardian review the assessment?
A: Parents or guardians may arrange with their school to review the PSSA before the
administration of the assessment to ensure that there are no conflicts with religious
beliefs.
Q: Can a parent or guardian opt a child out of the PSSA?
A: Yes, but only for religious reasons. After review of the PSSA, if a parent or guardian
chooses to opt his/her child out of participation in the PSSA because of a conflict with
religious belief, the parent or guardian must make such a request in writing to the
superintendent.
Q: What are the consequences of not participating in the assessment?
A: Because the PSSA is designed to measure a student’s attainment of the Pennsylvania
Assessment Anchors, non-participation prevents the student from seeing how well he/she
has achieved the Anchors. Also, non-participation affects a school’s overall results. If a
significant percentage of students fail to participate in the assessment, a true picture of
the school will not be presented. Finally, federal legislation makes it mandatory for each
school to have at least 95% of its population participate in the state assessment, or it will
fail to make adequate yearly progress (AYP).
Q: Must special education students participate in the assessment?
A: Yes, all students are required to participate. Students who have significant disabilities
participate in an alternate assessment (PASA). Special education students participating in
the regular assessment are to be provided accommodations as outlined in both the
student’s IEP and the PSSA Accommodations Guidelines that can be found at
http://www.pde.state.pa.us/a_and_t/cwp/view.asp?a=108&Q=45132&a_and_tNav=|678|&a_and_tNav=|.
Q: Must LEP/ELL students participate in the assessment?
A: The U.S. Department of Education released guidance on participation of LEP students
in state assessments. This flexibility allows LEP students in their first year of enrollment in
U.S. schools, not including Puerto Rico, the option of taking the Pennsylvania System of
School Assessment (PSSA) Reading assessment. If students choose to participate, their
performance level results will not be included in the AYP calculations for the
school/district. All LEP students are now required to take the ACCESS for ELLs English
language proficiency assessment, and all LEP students are still required to participate in
the Mathematics and Science assessments, with accommodations as appropriate. However,
the Mathematics scores of LEP students in their first year of enrollment in U.S. schools,
not including Puerto Rico, will not be used to determine AYP status. For additional
information, see the 2007 Accommodations Guidelines at
http://www.pde.state.pa.us/a_and_t/lib/a_and_t/2007AccommodationsGuidelines.pdf
Q: Must my child perform at a certain level on the PSSA to graduate?
A: §4.24 of Chapter 4 states that each school entity [district, including charter
schools,] shall specify requirements for graduation in the strategic plan under § 4.13
(relating to strategic plans). Requirements shall include course completion and grades,
completion of a culminating project and results of local assessments aligned with the
academic standards. Beginning in the 2002-2003 school year, students shall demonstrate
proficiency in reading, writing and mathematics on either the State assessments
administered in grade 11 or 12 or local assessment aligned with academic standards and
State assessments under § 4.52 (relating to local assessment system) at the proficient
level or better to graduate. The purpose of the culminating project is to assure that
students are able to apply, analyze, synthesize and evaluate information and communicate
significant knowledge and understanding.
Q: What is meant by “proficient”?
A: Pennsylvania has identified four levels of performance: The Advanced Level reflects
superior academic performance. Advanced work indicates an in-depth understanding and
exemplary display of the skills included in the Assessment Anchors. The Proficient
Level reflects satisfactory academic performance. Proficient work indicates a solid
understanding and adequate display of the skills included in the Assessment Anchors.
The Basic Level reflects marginal academic performance. Basic work indicates a partial
understanding and limited display of the skills included in the Assessment Anchors. This
work is approaching but not reaching satisfactory performance. There is a need for
additional instructional opportunities and/or increased student academic commitment to
achieve the Proficient Level. The Below Basic Level reflects inadequate academic
performance. Below Basic work indicates little understanding and minimal display of the
skills included in the Assessment Anchors. There is a major need for additional
instructional opportunities and/or increased student academic commitment to achieve the
Proficient Level.
Q: How were performance levels developed?
A: The Pennsylvania Department of Education used the statistical standard-setting
procedure called the Modified Bookmark Method. Pennsylvania teachers, higher
education representatives, and members of educational and assessment organizations,
such as the National Center for the Improvement of Educational Assessment, examined
the PSSA booklets with questions ordered from easiest to hardest. Based on their
experience in determining students’ abilities at different levels, they made determinations
of advanced, proficient, basic, and below basic by placing a “bookmark” at the point in
the booklet that best represented each level.
Q: Where can I obtain more information on the PSSA?
A: From the PDE website:
http://www.pde.state.pa.us/a_and_t/site/default.asp?g=0&a_and_tNav=|630|&k12Nav=|1141|
The Assessment Anchor Content Standards
Q: What are Assessment Anchor Content Standards?
A: The Assessment Anchor Content Standards are one of the many tools the
Pennsylvania Department of Education has developed to better align curriculum,
instruction and assessment practices throughout the state. The anchors provide focus and
clarity to the standards assessed on the PSSA and can be used by educators to help
prepare their students for the PSSA. The anchor metaphor is intended to signal that the
Assessment Anchors would anchor both the state assessment system and the
curriculum/instructional practices in schools.
Q: Why do we need Assessment Anchor Content Standards if we already have the
Pennsylvania Standards?
A: Teachers across the Commonwealth have been using the state standards to develop
curriculum and instructional materials. Likewise, the Department and teacher committees
have been using the standards to develop the state assessments. Over the last few years,
however, teachers have expressed a need for a clearer, more focused document, noting
that the Pennsylvania Academic Standards were often too broad and too numerous. The
Assessment Anchor Content Standards enable the PSSA to have a higher level of clarity.
Q: Do the Assessment Anchor Content Standards replace the Pennsylvania
Standards?
A: No. The Assessment Anchor Content Standards do not replace the Pennsylvania
Academic Standards. All teachers are still required to teach to all of the academic
standards per Chapter 4 regulations and use local assessments to measure student
progress. The content standards (anchors) clarify which academic standards are assessed
on the PSSA.
Q: Will teachers teach only the Assessment Anchor Content Standards and ignore
other knowledge and skills?
A: The Assessment Anchors were written with the intent of having interdisciplinary
discussions about how the Mathematics and Reading Anchors can be taught in Science,
Social Studies, the Arts and other content areas. The intent of the Anchors is not to
narrow the curriculum, but to focus teachers on the essential skills and knowledge in
Reading, Mathematics, and Science that must be taught across the curriculum, given the
limited amount of time teachers have with students.
Q: How were the Assessment Anchor Content Standards developed and by whom?
A: The Department of Education developed the Assessment Anchors based on the
recommendations of teachers serving on the Mathematics, Reading, and Science
Assessment Advisory Committees and other curriculum experts. The Department also
looked to national organizations (i.e., the National Council of Teachers of Mathematics
(NCTM), the National Council of Teachers of English (NCTE), the National Center for
the Improvement of Educational Assessment (NCIEA), and the National Assessment of
Educational Progress (NAEP)) and other external groups for input. The PDE had seven
criteria in mind with the development of the Anchors. The Department wanted the
Assessment Anchor Content Standards to be:
• Clear: Clarification of which standards are assessed on the PSSA. The Anchors
should be easy to read and user friendly.
• Focused: Rather than have teachers "guess" which standards are most critical, the
Anchors identify a core set of standards that could reasonably be assessed on a
large-scale assessment.
• Aligned: The Anchors align directly to the state's standards in Reading,
Mathematics, and Science and simply give focus and clarity to the Academic
Standards.
• Grade Appropriate: Teachers may have different ideas about the grade level at
which skills should be mastered. The Anchors provide clear examples of skills
and knowledge that should be learned at the different grade levels and that will be
assessed on state tests.
• Organized to Support a Curricular Flow: Rather than simply identifying
Anchors in the grades for which the state has standards, the PDE developed
Assessment Anchors in grades 3–8 and 11 to encourage a curricular spiral that
builds each year to the next.
• Rigorous: The Department has maintained the rigor of the state academic
standards through the Anchors. In addition, the State will continue to use
open-ended items on the PSSA to assess higher-order reasoning and problem-solving
skills.
• Manageable: The PDE wanted to identify a set of standards that could be taught
in a manageable way before the spring administration of the PSSA.
Q: How are the Assessment Anchor Content Standards organized?
A: The following can be found in the handbook:
• Reporting Category:
The Anchors are organized by Reporting Categories. The Reporting Category
appears at the very top of each page. There are five Reporting Categories in
Mathematics, two Reporting Categories in Reading, and four Reporting
Categories in Science. Reporting Categories are important because individual
student scores will be reported at this level. District and school reports may
include reports by Assessment Anchor if there are enough questions on the
PSSA to warrant a valid score by the broad Anchor statement.
• Subcategory:
In Science, the sub-reporting category is listed below the Reporting Category.
• Assessment Anchor:
The column on the left-hand side of the page is the Assessment Anchor.
• References:
Below each Assessment Anchor is a reference in italics. This reference relates
to the Pennsylvania Academic Standards and helps connect the Anchors to the
Standards.
• Eligible Content:
The column on the right-hand side of the page beside each Assessment Anchor
is the Eligible Content. This is often known as the “assessment limits” and
helps teachers identify how deeply they need to cover an Anchor and/or the
range of the content they should teach to prepare their students for the PSSA.
Not all of the Eligible Content is assessed on every form of the PSSA, but it
shows the range of knowledge from which the test is designed.
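For readers who find it easier to picture this organization in outline form, the sketch below
models the hierarchy (Reporting Category, Assessment Anchor with its standards reference,
and Eligible Content) as a simple nested structure. It is an informal illustration only; the class
and field names are hypothetical, the sample standards reference is assumed, and nothing here
is drawn from an actual PDE system or file format.

    from dataclasses import dataclass, field

    @dataclass
    class AssessmentAnchor:
        label: str                    # e.g., "S8.D.1.1"
        description: str              # the anchor statement
        standards_reference: str      # reference to the PA Academic Standards
        eligible_content: list[str] = field(default_factory=list)

    @dataclass
    class ReportingCategory:
        code: str                     # "A" through "D" in Science
        name: str                     # e.g., "Earth and Space Sciences"
        anchors: list[AssessmentAnchor] = field(default_factory=list)

    earth_and_space = ReportingCategory(
        code="D",
        name="Earth and Space Sciences",
        anchors=[
            AssessmentAnchor(
                label="S8.D.1.1",
                description=("Describe constructive and destructive natural processes "
                             "that form different geologic structures and resources."),
                standards_reference="3.5 Earth Sciences",  # assumed for illustration
            )
        ],
    )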
Q: How can teachers, schools, and districts use the Assessment Anchor Content
Standards?
A: The Assessment Anchors can help focus teaching and learning because they are clear,
manageable, and closely aligned to the PSSA. Teachers and Administrators will be better
informed about which standards will be assessed on state tests. The Assessment Anchors
should be used in combination with the assessment handbooks that include the test
blueprints of the PSSA. With this degree of information, teachers can more easily embed
these skills and knowledge in the larger curriculum. Elective and support staff can also
“adopt” an Assessment Anchor. In this way, an entire school and community can teach
and reinforce these critical Reading, Mathematics, and Science standards.
Q: What is the difference between the Assessment Anchor Content Standards and
“anchor papers”?
A: Anchor papers are not the same as the Assessment Anchors. To score open-ended
items on the PSSA, Pennsylvania teachers read a sampling of the student responses on
the open-ended items and try to identify responses or “papers” that exemplify the
different score points on the scoring guideline. These responses are called “anchor
papers.” They are called anchor papers because they “anchor” the scoring process and are
used to train scorers. When the Department releases open-ended items with student work,
the anchor papers are often released with the items.
Q: Will the anchors ever be revised or changed?
A: Like the standards, the Anchors are reviewed periodically to ensure that they represent
the most important skills and knowledge that should be assessed on the PSSA.