
Pennsylvania Value-Added Assessment System (PVAAS):
PVAAS Public Data Release

Update to PDE and PAIU
2 Sessions: March 9 and 10, 2011

Kristen Lewald, Ed.D.
PVAAS Statewide Team for PDE
Lancaster-Lebanon IU 13

Today’s Session

PDE and PAIU Executive Directors’ meeting was held last Friday
Questions from the public and districts/schools
Provide an update to PAIU
• PDE is joining us today as well

Agenda

Achievement and Progress

Available Data & Crosswalk

Public Website Demonstration

Myths and Concerns: Feedback

Suggested Resources/Supports for Communicating PVAAS

Questions

Achievement vs. Progress

Achievement

The final result of an academic experience
Highly correlated with demographic factors, such as socioeconomic status
Affected by factors outside the school

Progress

The concept underlying value-added analysis and reporting
Not correlated with demographic factors
Dependent upon what happens as a result of schooling

Value-Added is…

A statistical analysis used to measure a district’s or school’s impact on the academic progress rates of groups of students from year to year.

Conceptually, a growth measure is approximately the difference between current achievement (current results) and prior achievement (prior results) with achievement being measured by an appropriate assessment, such as the PSSA.

However, PVAAS is NOT a simple comparison of two scores!
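The “approximate difference” described above can be sketched in a few lines. This naive subtraction is NOT what PVAAS computes (PVAAS uses a multivariate longitudinal mixed model); it only illustrates the underlying idea of growth:

```python
# Naive conceptual growth: current achievement minus prior achievement,
# both in NCE units. PVAAS itself does NOT use this simple subtraction.
def naive_growth(current_nce, prior_nce):
    return current_nce - prior_nce

# A cohort averaging 52.0 NCE this year after 50.0 NCE last year shows
# roughly +2.0 NCE of conceptual growth.
print(naive_growth(52.0, 50.0))  # 2.0
```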

PVAAS Provides:

Public Site

Looking Back/Evaluation…
Value-Added Growth Reports for Cohorts of Students

Looking Forward/Planning…
PVAAS Projection Reports for Individual Students and Cohorts of Students

Today

Key Concepts in Understanding PVAAS Reporting

One of the inherent benefits of the PVAAS analyses is that all scores for all students are included in the analyses. Simplistic approaches are less precise and sometimes biased.

While PVAAS does use individual student data to yield value-added reporting, there are NO individual student measures of growth yielded from PVAAS.

Value-Added & PVAAS

Pennsylvania’s model for value-added is called PVAAS – the Pennsylvania Value-Added Assessment System.

PVAAS is based on the EVAAS methodology – the Education Value-Added Assessment System.

The EVAAS methodology has been nationally reviewed and published.

PVAAS System

Uses EVAAS statistical methodology

• EVAAS: Education Value-Added Assessment System
• Mixed-model multivariate longitudinal analyses
• Lead Developer: Dr. Bill Sanders, Univ. of TN
• Now: SAS, Inc. (Cary, NC) for EVAAS/PVAAS
• Jim Goodnight, CEO

8-Year History of PVAAS

Pilot from 2002-2005: 100 Districts
Fall 2006: Grades 4 and 6 reporting to all 501 districts
Fall 2007: Grades 4-8 reporting to all districts
Fall 2008: Grades 4-8 & 11 reporting to all districts
Fall 2009 and 2010: Full reporting to 500 districts
• Math, Reading, Science, Writing
• Grades 4-8 & 11
• Used as a provision to meet AYP for NCLB – AYP Growth Model
February 2011: Release of public reporting site https://pvaas.sas.com

PVAAS Data Provides

Information to:

Raise Achievement

Close Achievement Gaps

Decrease Dropouts

Increase College Readiness

Key Concepts in Understanding PVAAS Reporting

PVAAS reporting reflects the effectiveness of your district’s or school’s Standards-Aligned System

PVAAS reporting reflects the district’s or school’s system regarding curriculum, assessment, and instruction

PVAAS Password-Protected vs. Public Site

PVAAS yields data on Districts, Schools, Grades, Subgroups, Students

Math, Reading, Science, Writing

Public

Protected

Achievement + Growth

Achievement results (PSSA) and growth results (PVAAS) must be used together to get a complete picture of student learning.

To view the achievement results of Pennsylvania's public districts/schools, go to: http://paayp.emetric.net/

Example: Achievement + Growth

[Scatterplot: Achievement vs. Growth – 4th Grade Math; x-axis: PVAAS Growth Value]

Example: Achievement + Growth

Same Schools

[Scatterplot: Achievement vs. Growth – 4th Grade Reading; x-axis: PVAAS Growth Value]

Overview of PVAAS Public Reports & Features:
Screen Shots, THEN LIVE Web Demo

New PVAAS Login Page

https://pvaas.sas.com

PVAAS Public Reports

Use of Reports Tab to Select & View Reports

PVAAS Public Reports

Value-Added Summary Reports

• District/LEA and School Level data only

• Math and Reading

• Grades 4-8 and 9-11

School Search Capability

• Allows users to find and view the progress of local schools, charter schools, and full-time CTCs across Pennsylvania.
• Can search for similar schools based on grade levels tested, various demographics, Intermediate Unit (IU) region, and/or county.

PVAAS Public Reports

• Use of Tests Tab to View Reports at Different Grade Levels
• Use of Subjects Tab to View Reports for Different Subjects

Reports on Public Site

The public reports have a different format from those on the district password-protected site.

However, the measures on the PVAAS public site come directly from the reports on the district/school password-protected site.

We will crosswalk between these two sites in this session.

3 Key Resources for Public Reporting

Guide to Public Reporting
Crosswalk
Two-pager on public reporting

Purpose of District & School Value-Added Data

Provides users with information to assist them in evaluating the overall effectiveness of a district/LEA or school on the academic progress of groups of students.

• This report is NOT a report on teacher effectiveness!

Example of District Value-Added Summary Report
Grades 4-8, Math & Reading

Example of School Value-Added Summary Report
Grades 4-8, Math & Reading

Crosswalk: Where can I find this information on other reports?

Public Site: School Value-Added Summary Report

Restricted Site: School Value-Added Report

What is the Average Gain over Grades Relative to the Growth Standard?

• Represents the average gain across the grade levels served between 4 and 8, compared to the Growth Standard.
• It is the average academic growth of the district’s or school’s students, compared to the Growth Standard.
• Answers the question, “How effective was the district/LEA/school in impacting the academic progress of its students compared to the Growth Standard?”

Example: Average Gain over Grades Relative to the Growth Standard

Average Gain over Grades Relative to the Growth Standard (on the public report) is the SAME as the Mean NCE Gain over Grades Relative to the Growth Standard on the School Value-Added Report (password-protected site)!

[Report excerpt: 4th Grade Gain = 0.2; 5th Grade Gain = 2.1; Average Gain over Grades Relative to Growth Standard = 1.7]

What is the Average Gain over Grades Relative to the State?

• Represents the average gain across the grade levels served between 4 and 8, compared to the average progress of all students in Pennsylvania at the same grade levels.
• It is the average academic growth of the district’s or school’s students, compared to the academic growth of students statewide.
• Answers the question, “How much did the district/LEA/school impact the academic progress of its students compared to the progress of other students in Pennsylvania in those same grade levels?”

Example: Average Gain over Grades Relative to the State

Average Gain over Grades Relative to the State (on the public report) is the SAME as the Mean NCE Gain over Grades Relative to the State on the School Value-Added Report (password-protected site)!

[Diagram:
Average Gain over Grades Relative to Growth Standard for the school = 1.7
Average Gain over Grades Relative to Growth Standard for the State = ?
Average Gain over Grades Relative to the State = 0.4]

What is the Average Gain over Grades Relative to the Growth Standard for the State?

[Report excerpt: Grade 4 State 3-Yr-Avg = 3.0; Grade 5 State 3-Yr-Avg = -0.4; Average Gain over Grades Relative to Growth Standard for the State = 1.3]
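The arithmetic on this slide can be sketched directly; a minimal Python example using the two grade-level values shown:

```python
# Sketch of the slide's arithmetic: the state's Average Gain over Grades
# Relative to the Growth Standard is the mean of the state's 3-year-average
# gains across the tested grade levels shown.
state_gains = {"Grade 4": 3.0, "Grade 5": -0.4}  # state 3-yr-avg gains (NCEs)

average_gain_state = sum(state_gains.values()) / len(state_gains)
print(round(average_gain_state, 1))  # 1.3
```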

Example: Average Gain over Grades Relative to the State

Average Gain over Grades Relative to the State (on the public report) is the SAME as the Mean NCE Gain over Grades Relative to the State on the School Value-Added Report (password-protected site)!

[Diagram:
Average Gain over Grades Relative to Growth Standard for the school = 1.7
Average Gain over Grades Relative to Growth Standard for the State = 1.3
Average Gain over Grades Relative to the State = 0.4]
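The relationship among these three values is a one-line subtraction; a minimal Python sketch using the slide’s numbers:

```python
# Sketch: the school's gain Relative to the State is its gain relative to
# the Growth Standard minus the state's gain relative to the Growth Standard.
school_vs_growth_standard = 1.7  # school's Average Gain vs. Growth Standard
state_vs_growth_standard = 1.3   # state's Average Gain vs. Growth Standard

relative_to_state = school_vs_growth_standard - state_vs_growth_standard
print(round(relative_to_state, 1))  # 0.4
```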

PVAAS Value-Added Growth Descriptors
Grades 4-8, Math and Reading

The Average Gain over Grades for grades 4-8 is expressed in Normal Curve Equivalent (NCE) units. The use of NCEs allows PSSA scores in any school year and grade level to be compared across years.

Green (Favorable) – The district/LEA/school was effective in supporting students to achieve one year’s worth of academic growth in a year.

Yellow (Caution) – There was minimal evidence that the district/LEA/school was not effective in supporting students to achieve one year’s worth of academic growth in a year.

Rose (Concern) – There was moderate evidence that the district/LEA/school was not effective in supporting students to achieve one year’s worth of academic growth in a year.

Red (Strong Concern) – There was significant evidence that the district/LEA/school was not effective in supporting students to achieve one year’s worth of academic growth in a year.

Example of District Value-Added Summary Report
Grades 9-11, Math & Reading

Example of School Value-Added Summary Report
Grades 9-11, Math & Reading

Crosswalk: Where can I find this information on other reports?

Public Site: School Value-Added Summary Report

Restricted Site: School Value-Added Report

PVAAS Value-Added Growth Descriptors
Grades 9-11, Math and Reading

The District/School Effect for grades 9-11 is expressed in PSSA scaled-score points.

Green (ABOVE Predicted Achievement) – The district/LEA/school was highly effective. The district/LEA/school exceeded the expected progress with its students.

Yellow (MET Predicted Achievement) – The district/LEA/school was effective. The district/LEA/school met the expected progress with its students.

Rose (BELOW Predicted Achievement) – The district/LEA/school was not effective. The district/LEA/school did not meet the expected progress with its students.

What is the District/School Effect?

• Provides an estimate of the district’s or school’s impact on students’ academic progress. Specifically, the District/School Effect is a function of the difference between the observed/actual PSSA achievement and the predicted PSSA achievement.

• It is a measure of the growth that students tested in grade 11 have made over the three years since being tested in grade 8, and it uses data from grades 3-8.

• If students score as expected (i.e., students’ observed scores are equal to their predicted scores), then the District/School Effect would be 0. A negative District/School Effect indicates students’ actual scores were lower than their predicted scores, while a positive District/School Effect indicates students’ actual scores were higher than their predicted scores.

• Answers the question, “How effective was the district/LEA/school in promoting student academic growth and supporting students to meet or exceed their expected progress?”
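The sign convention described above can be illustrated with a small sketch. This is NOT the PVAAS estimation method (the actual effect comes from a mixed model); the simple mean difference below only shows how observed-vs.-predicted differences map to a positive, zero, or negative effect:

```python
# Illustrative sketch only: mean of (observed - predicted) PSSA scores.
# The actual District/School Effect comes from PVAAS's statistical model.
def school_effect(observed_scores, predicted_scores):
    diffs = [o - p for o, p in zip(observed_scores, predicted_scores)]
    return sum(diffs) / len(diffs)

# Students scoring exactly as predicted -> effect of 0.
print(school_effect([1300, 1250], [1300, 1250]))  # 0.0
# Scores above prediction -> positive effect.
print(school_effect([1320, 1260], [1300, 1250]))  # 15.0
```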

Is it appropriate to compare the amount of progress made by a district/school to another district/school?

No! Not using the District and School Value-Added Summary Reports.

• Without taking the Standard Error into account, it is NOT possible to directly compare these gain values across districts/schools.
• The color-coding of the growth measures (gain values) does in fact take the Standard Error into account. Note the link to the color-code legends on each report.
• The Average Growth Index found in the School Search report feature takes the Standard Error into account and allows a more direct comparison across schools.

PVAAS Public Reports

Use of Reports Tab to Select & View Reports

Purpose of School Search

Users can find and view the progress of public schools across Pennsylvania and search for similar schools based on grade levels tested, various demographics, Intermediate Unit (IU) region, and/or county.

School Search

Where can I find this information on other reports?

Public Site: School Search

Restricted Site: School Search

Which schools are included when I use School Search?

Schools with at least one tested grade in common with the “reference school” you selected, AND matching any demographics selected, as well as the IU or county region.

Example: Your reference school is a grade 6-8 school
• Other schools included in the search may include grade 6-7 schools, grade 7-8 schools, K-6 schools, etc.

How are schools compared?

PVAAS: Average Growth Index

What is an Index?

• A numerical scale used to compare variables with one another or with some reference number

Analogy: Consumer Price Index

• A measure of the average change over a period of time

• Statistical Indicator

• Reflects patterns

The PVAAS Average Growth Index allows viewers to compare growth across schools.

What is the Average Growth Index?

• A measure of student progress across the tested grade levels in a school.
• This index is a value based on the average growth across grade levels and its relationship to the standard error, so that comparison among schools is meaningful. If the standard error is not accounted for, users might get a skewed picture of the relative effectiveness of different schools.
• For grades 4 through 8, the Average Growth Index is calculated by dividing the Average Gain over Grades Relative to the Growth Standard by the corresponding Standard Error.
• For grades 9 through 11, the Average Growth Index is calculated by dividing the School Effect by the corresponding Standard Error.

Example: Average Growth Index, Grades 4-8

Average Growth Index
= Average Gain over Grades Relative to the Growth Standard divided by the Standard Error
= 2.9 / 0.4
= 7.3 (due to rounding, this may NOT be exactly what is reported on the public site)

Example: Average Growth Index, Grades 9-11

Average Growth Index
= School Effect divided by the Standard Error
= -13.7 / 9.1
= -1.5 (due to rounding, this may NOT be exactly what is reported on the public site)
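Both grade bands use the same formula, a growth measure divided by its standard error; a minimal sketch with the two sets of slide values:

```python
# Average Growth Index = growth measure / standard error.
def average_growth_index(growth_measure, standard_error):
    return growth_measure / standard_error

# Grades 4-8: Average Gain over Grades Relative to the Growth Standard / SE
print(round(average_growth_index(2.9, 0.4), 2))   # 7.25 (reported as 7.3)
# Grades 9-11: School Effect / SE
print(round(average_growth_index(-13.7, 9.1), 2)) # -1.51 (reported as -1.5)
```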

What is the Average Growth Index?

Average Growth Index > 0 → On average, students in the school achieved at least a year’s worth of academic growth in a year. A large, positive Average Growth Index provides more evidence that more than a year’s worth of growth was experienced by the average student in the school.

Average Growth Index < 0 → On average, students in the school achieved less than a year’s worth of academic growth in a year. A large, negative Average Growth Index provides more evidence that less than a year’s worth of growth was experienced by the average student in the school.

Why can’t I find a district/school on the PVAAS public site?

Districts/schools with fewer than 10 students have been suppressed
• PA public reporting requirements

Districts/schools that do not receive PVAAS reporting are not included
• Example: K-3 school
• PVAAS reporting is provided for reading and mathematics in grades 4-8 and 11

Why can’t I find a district/school on the PVAAS public site?

Districts/schools that receive growth reports reflecting the progress of only ONE grade level are suppressed:
• Example: Grade 11-only school
• Example: Grade 11-12 school
• Example: Grade 6-only school
• Example: K-4 school

Act 104’s legislative intent was district- and building-level data

Website https://pvaas.sas.com

The Myths of PVAAS

Myth #1: PVAAS provides growth measures for an individual student.

PVAAS does not estimate growth for one student because:

• The PSSA observed scores (and resulting NCE scores) are simply ‘snapshots in time,’ making comparisons of the observed scores as a measure of growth very unreliable.
• The error in an estimate for a data set with only one record (one student) is too large to make the estimate meaningful… and
• Error depends on variation in the data AND the sample size (the number of student records in a dataset).

An estimate of progress, or growth, based on only one student would have a much larger error, and therefore be considerably less precise than when considering a group or cohort of students.

Myth #2: Growth is correlated with certain demographic variables.

There is NO relationship between demographic variables, such as socioeconomic status, and growth.

• There are high achieving schools making high growth;

• There are high achieving schools making low growth;

• There are low achieving schools making high growth;

• There are low achieving schools making low growth.

Growth reporting reflects what WE do with students in terms of academic growth in schools/districts.

-- VAAS can remove the effects of factors not under the control of the school (McCaffrey, Lockwood, Koretz & Hamilton, 2003; Ross, Wang, Sanders, Wright & Stringfield, 1999a; Wright, Horn & Sanders, 1997).

Myth #3: Growth (grades 4-8) is calculated based on how other schools perform.

Each year, a group’s growth is calculated by comparing its position in the current grade distribution from 2006 to its former position in the previous grade’s distribution from 2006.

Performance of other groups in a given year does NOT affect the growth calculation of the cohort in question. Each group becomes its own control group!

Growth Analogy:

• For a child to get taller, another child does not have to get shorter!

• A child can grow taller in a given year no matter how his/her peers grow.

Myth #4: Since PSSA distributions change each year, growth (grades 4-8) is based on a moving target. (We could never get a green.)

All grade 4-8 growth calculations and interpretations are based on the base year distributions from 2006.

• The 2006 PSSA Math and Reading distributions provide typical demonstrations of achievement of cohorts as they progress through the grade levels.

PSSA scaled scores are converted to NCE units using the parameters from the 2006 distributions so they are relative to the same standard each year.
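The NCE conversion can be sketched with the standard parameterization (mean 50, standard deviation 21.06). Whether PVAAS applies exactly this formula with the 2006 base-year mean and SD is an assumption for illustration; the key idea is that each year’s scaled scores are placed on the same 2006-anchored scale:

```python
# Hedged sketch of a Normal Curve Equivalent (NCE) conversion. The base-year
# mean/SD values used in the example call are hypothetical placeholders.
def to_nce(scaled_score, base_mean, base_sd):
    z = (scaled_score - base_mean) / base_sd  # z-score vs. 2006 base year
    return 50 + 21.06 * z                     # NCE scale: mean 50, SD 21.06

# A score at the 2006 base-year mean maps to NCE 50 in any later year.
print(to_nce(1300, base_mean=1300, base_sd=100))  # 50.0
```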

SAS, Inc. (the vendor for PVAAS) evaluates each year’s distributions to verify that using the base year of 2006 continues to be appropriate.

Myth #5: The PSSA is not designed to discriminate well at the extremes, so growth cannot be calculated using the PSSA.

PSSAs are designed to have sufficient stretch to discriminate between Below Basic, Basic, Proficient, and Advanced performance levels.

There is no ceiling on the PSSA! – PDE
• Each year, scores are scaled to allow the high end to be scaled on the distribution of the data – not on a fixed, pre-determined value.

The PSSA meets the three conditions to be used in PVAAS analyses:
• Must be aligned to curriculum standards.
• Must be reliable and valid.
• Must demonstrate sufficient stretch at the extremes.

Myth #6: PVAAS is not reliable or valid since it is based on only one test, the PSSA.

PVAAS uses a multivariate, longitudinal mixed effect model in its analyses. It is not a simple comparison of two test scores!

All prior assessment scores are used.

Standard error is always reported.

PVAAS is an indicator of growth, or progress, of groups of students towards mastery of the Pennsylvania academic standards.

Myth #7: If students are already high achieving, it is harder to show growth.

In PVAAS, one year’s growth is about maintaining achievement levels (grades 4-8) or meeting expected performance (grades 9-11) based on a specific group’s prior academic performance.

For high-achieving groups, one year’s growth may be sufficient or acceptable.

For low-achieving schools, one year’s growth may not be sufficient or acceptable in order for students to meet the long-term achievement goal of proficiency.

Myth #8: It is not possible to show progress with all groups of students, such as students with IEPs or gifted students.

If assessments have enough “stretch” to measure the achievement of both low- and high-achieving students, it is possible to measure all groups of students’ progress.

The PSSA meets the criteria!

The value-added methodology used is sensitive to individual students’ achievement levels.

It measures growth from the end of one year to the end of the next year, regardless of whether a student performs below, at, or above grade level/proficiency.

Myth #9: PVAAS should always indicate growth if our percent of students proficient/advanced increased since last year.

When following the same grade level from one year to the next in determining the percent proficient/advanced, these are two different groups of students (i.e., 6th graders in 2010 are not the same group of students as 6th graders in 2011). PVAAS on the other hand, is looking at the most recent group of students and evaluating their progress from the prior school year in the prior grade level (same group of students).

PVAAS is not measuring progress by students increasing or decreasing entire performance levels. PVAAS is sensitive to subtle changes in progress, even within performance levels.

Example: Some students may have moved from non-proficient to proficient status. However, students already proficient/advanced may be “slipping” in terms of their level of achievement compared to where they were the year prior. In other words, students may still be proficient/advanced, just not as high within those performance levels as they were in the prior year.

Myth #10: PVAAS cannot measure the progress of districts and schools with high mobility rates.

Value-added analysis includes all students for whom there are sufficient test data, including highly mobile students.

From a statistical perspective, it is important to include highly-mobile students in the analysis because their exclusion could bias the results.

From a philosophical perspective, all students must be included in the school’s analysis to ensure that highly-mobile students receive the same level of attention as non-mobile students.

The EVAAS modeling approaches do take into account the quantity and quality of information available for each student!

Disclaimer for ANY Data Tool

NO data source should ever be considered in isolation.

ALL education decisions should be made on the basis of multiple sources of both quantitative and qualitative data.

ALL data provide indicators of phenomena.

When new data is gathered…

The intelligent user of data should ask:

Do these data provide insights that have not been available before?

Are these data consistent with data already collected?

Do these data confirm or conflict with our existing profile of students or programs?

What other data should be investigated in light of the new profile?

Suggested Resources for Communicating PVAAS

Communication Resource
See “PVAAS Key Communication Messages” – NEW!
• Key messages if you get a call from the press/media
• Or, if your IU wants to proactively assist districts, charter schools, full-time CTCs

District Press Release Template
Formal press release in development by PDE/PVAAS Statewide Core Team

PDE PVAAS Website, New Resources & Professional Development Opportunities, Intermediate Unit Supports

Suggested Resources

Guide to PVAAS Public Reporting

PVAAS Crosswalk

PVAAS Key Communication Messages

PVAAS Public Reporting Site Overview

PVAAS Evaluating Growth, Projecting Performance

PDE PVAAS Webpage
• Help Menus
• PowerPoint Presentations with Trainer Notes
• Resource Materials
• Video Clips
• District Case Studies

Suggested Resources

Archived webinars and PowerPoint presentations detailing the public reporting site

Coming Soon! Podcasts

• Introduction to PVAAS

• Value-Added Reporting

• School Search

PVAAS Help Menus on Public Reporting

Questions: PVAAS Materials or Statewide Implementation
pdepvaas@iu13.org
717-606-1911

PVAAS Report Web Site: https://pvaas.sas.com
www.pde.state.pa.us
333 Market Street, Harrisburg, PA 17126
