Scoring and Reporting Guide for Administrators

Information about Scores, Data Files, and Score Reports
2014-15 School Year
Last Updated: 9/16/15
This document provides an overview of scoring and reporting in DLM for the 2014-15 school year for
states using the integrated assessment model. Additional resources are available on the SCORING AND
REPORTING RESOURCES website for your state.
Standard Setting and Performance Levels
DLM results are not based on raw or scale scores; all data are based on diagnostic classification modeling.1
Standard setting allows us to look at patterns in the number of linkage levels mastered across the tested Essential Elements and to apply cut points that define categories of student performance. Performance is reported using the four performance levels approved by the consortium:
 The student demonstrates emerging understanding of and ability to apply content knowledge
and skills represented by the Essential Elements.
 The student’s understanding of and ability to apply targeted content knowledge and skills
represented by the Essential Elements is approaching the target.
 The student’s understanding of and ability to apply content knowledge and skills represented by
the Essential Elements is at target.
 The student demonstrates advanced understanding of and ability to apply targeted content
knowledge and skills represented by the Essential Elements.
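To illustrate how cut points work, the sketch below maps a count of linkage levels mastered to one of the four performance levels. The cut points shown are invented for illustration only; the actual cut points are grade- and subject-specific and come from the consortium's standard setting process.

```python
# Hypothetical cut points for illustration only: the real cut points are
# grade- and subject-specific and come from the consortium standard setting.
EXAMPLE_CUTS = [
    (9, "Emerging"),                 # 0-9 linkage levels mastered
    (17, "Approaching the Target"),  # 10-17
    (25, "At Target"),               # 18-25
]

def performance_level(linkage_levels_mastered: int) -> str:
    """Map a count of linkage levels mastered across the tested Essential
    Elements to one of the four DLM performance levels (illustrative cuts)."""
    for upper_bound, label in EXAMPLE_CUTS:
        if linkage_levels_mastered <= upper_bound:
            return label
    return "Advanced"

print(performance_level(20))  # -> "At Target" under these made-up cut points
```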
Each state will determine how the DLM performance level descriptors (PLDs) translate into its own definitions of proficient or not proficient. We will provide the General Research File (GRF), which includes the final DLM performance level in each subject. You will apply your individual state's accountability measures to the GRF to determine proficient/non-proficient status for accountability purposes.
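As an illustration only, the snippet below shows one hypothetical state rule, assuming the GRF encodes the four performance levels as 1 through 4 (with 9 indicating the student was not assessed, as described in the FAQ at the end of this document). Each state defines its own rule.

```python
# Hypothetical state rule for illustration only; each state defines its own
# translation of DLM performance levels into accountability categories.
# Assumes the GRF encodes the levels as 1-4, with 9 meaning "not assessed".
def accountability_category(performance_level: int) -> str:
    if performance_level == 9:
        return "not tested"
    return "proficient" if performance_level in (3, 4) else "not proficient"

print(accountability_category(4))  # -> "proficient" under this example rule
```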
Standard setting is a consortium-wide process. The same standard setting methods are used for each testing model, although the panels were run separately and different cut points were identified for the integrated and year-end models. After standard setting, performance levels were added to the GRF and sent to state partners, which accounts for the amount of time between the close of the testing windows and the GRF release.
A detailed description of the standard setting method is provided in the document repository on the
state members’ area of the DLM website.
Data Files
There are three data files delivered to states at the end of the year:
 General Research File (GRF), which contains student results (e.g., “<state>_GRF_20150801_File_Structure.xlsx”)
 Date/Time Supplemental File, which provides date stamps for each student for each Essential Element tested (e.g., “<state>_Date Time Supplemental File.xlsx”)
 Incident File, which lists students who were impacted by one of the known problems with testlet assignments during the spring 2015 window (e.g., “<state>IncidentsAug2015.csv”)

1 Further information about DLM psychometrics has been provided to consortium partner states in separate documents.
The file structure for each of these files is located on the SCORING AND REPORTING RESOURCES page
(http://dynamiclearningmaps.org/srr/im).
The GRF and the date/time supplemental file house a great deal of information organized into columns.
If combined, the number of columns would be too large for some software to read. Therefore, the GRF
and supplemental files are provided separately and follow different structures. The date/time
supplemental file shows every EE the student was tested on, along with the start and end times. The
GRF reports the highest linkage level mastered and the final performance level. For more information,
see the File Structure Data Dictionary (.xlsx) on the Scoring and Reporting Resources page.
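A minimal sketch of reading and combining the two files with pandas follows. The file names and the join column are assumptions for illustration; the authoritative column names are in the File Structure Data Dictionary.

```python
import pandas as pd

# Illustrative file names; actual names follow the state-specific patterns above.
# dtype=str preserves leading zeroes in organizational codes and student IDs.
grf = pd.read_excel("state_GRF_20150801.xlsx", dtype=str)
date_time = pd.read_excel("state_Date_Time_Supplemental_File.xlsx", dtype=str)

# Assumed join key: a state student identifier present in both files.
# The GRF has one row per student record; the supplemental file has one row
# per tested Essential Element, so the merged result is one row per EE.
merged = date_time.merge(grf, on="State_Student_Identifier", how="left")
print(merged.head())
```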
Several criteria were used to filter the data included in the GRF and date/time supplemental file. These criteria included active statuses on student enrollments and rosters, meaning that both the roster and the enrollment that placed the student on that roster were valid at the time the data were extracted from the system. In addition, only data from student tests that were date stamped as having started during Field Tests B and C or the spring consortium window were included in the GRF. For questions about specific students in the GRF, please submit a ticket to the help desk at DLM-support@ku.edu.
Please note that an EE may be listed for a given student in the Incident File but not in the date/time supplemental file if the incident resulted in items that could not be included in scoring (e.g., no correct answer).
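If you want to check that note programmatically, the sketch below lists student/EE pairs that appear in the Incident File but not in the date/time supplemental file. The column names used here (State_Student_Identifier, EE) are assumptions; confirm them against the file structures on the resources page.

```python
import pandas as pd

# Illustrative file names; actual names follow the patterns shown above.
incidents = pd.read_csv("state_IncidentsAug2015.csv", dtype=str)
date_time = pd.read_excel("state_Date_Time_Supplemental_File.xlsx", dtype=str)

# Assumed columns: one row per student/EE pair in each file.
keys = ["State_Student_Identifier", "EE"]
flagged = incidents.merge(
    date_time[keys].drop_duplicates(), on=keys, how="left", indicator=True
)
# Keep EEs that appear only in the Incident File (no date/time record).
print(flagged[flagged["_merge"] == "left_only"][keys])
```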
Another resource available to you on the SCORING AND REPORTING RESOURCES page is a sample GRF with ten
fictional records. Please note—state organizational tables ultimately dictate the presentation of the
data. The sample you receive might vary slightly from how the state-specific data will display in the final
GRF your state receives.
See the last page of this document for some Frequently Asked Questions about the GRF.
Score Reports
Individual student score reports consist of two parts: (1) the Performance Profile, which aggregates linkage level mastery information for reporting on each conceptual area and for the subject overall, and (2) the Learning Profile, which reports the specific linkage levels mastered for each tested Essential Element. There is one score report per student per subject, unless the student has multiple roster records in that subject.
There are five linkage levels: initial precursor, distal precursor, proximal precursor, target, and successor. The Learning Profile shows columns that correspond to these linkage levels. The performance levels reported on the Performance Profile are at a higher level of aggregation particular to the grade and content area; they reflect a student's overall performance as determined through a standard setting process in summer 2015. There is no exact correspondence between a particular linkage level and an overall performance level.
Student results are aggregated into several other types of reports. At the classroom and school levels,
roster reports list individual students with the number of Essential Elements tested, number of linkage
levels mastered, and final performance level. District- and state-level reports provide frequency
distributions, by grade level and overall, of students tested and achieving at each performance level in
English language arts and mathematics.
Students who were enrolled in Educator Portal but did not complete any of the assessments are excluded from aggregated reports.
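States that want to reproduce the aggregated frequency counts from the GRF could start from something like the sketch below. The Grade and Performance_Level column names are assumptions about the GRF layout, and students who did not test (performance level 9) are excluded to mirror the rule above.

```python
import pandas as pd

# Illustrative file name; actual GRF names follow the state-specific pattern.
grf = pd.read_excel("state_GRF_20150801.xlsx", dtype=str)

# Exclude students who did not test in the content area (performance level 9),
# matching the rule that non-testers are left out of aggregated counts.
tested = grf[grf["Performance_Level"] != "9"]

# Count of students achieving each performance level, by grade.
counts = (
    tested.groupby(["Grade", "Performance_Level"])
          .size()
          .unstack(fill_value=0)
)
print(counts)
```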
The most current score report prototypes for individual score reports and class, school, district, and
state aggregated reports are located at http://www.dynamiclearningmaps.org/srr/im.
All reports are provided in .pdf format. If you experience any technical difficulties with opening a .pdf
report, please follow the directions below:
1. Open any Adobe file.
2. Go to Edit > Preferences > Security (Enhanced) and uncheck "Enable Protected Mode at startup."
3. Close all instances of Adobe Reader.
4. Reopen the score report.
Please contact the DLM helpdesk at 1-855-277-9751 (toll-free) or DLM-support@ku.edu if the issue is not resolved.
Delivery of Reports and Data Files
Data Files
The three data files (GRF, Date/Time Supplemental File, and Incident File) are shared via secure FTP or
server drive.
Individual Student Score Reports
Student reports are generated as separate PDF files. There is one PDF per student record in the GRF and
per subject. Individual student reports are packaged for delivery in folders, nested by district name,
school name, and grade. For states providing end-of-instruction assessments at the high school level, individual student reports are organized by course instead of grade. Files are named this way:
StudentLastName_FirstName_Contentarea
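As a convenience, the sketch below builds the expected path to a student's report under that folder convention. The exact folder names, separators, and capitalization may differ from what is shown here, so treat the pieces as assumptions.

```python
from pathlib import Path

def report_path(root: str, district: str, school: str, grade: str,
                last_name: str, first_name: str, content_area: str) -> Path:
    """Build the expected PDF path for one individual student score report,
    following the district/school/grade nesting and the
    StudentLastName_FirstName_Contentarea naming pattern described above."""
    file_name = f"{last_name}_{first_name}_{content_area}.pdf"
    return Path(root) / district / school / grade / file_name

print(report_path("reports", "Sample District", "Sample School", "Grade 5",
                  "Smith", "Alex", "ELA"))
```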
Individual student score reports are delivered either via server drive or DVD, and may be sent directly to
districts or just to the state. States were surveyed in early June about these delivery preferences.
Note: A score report is produced for every student record in the GRF. If the student had only values of
9 in the GRF (not assessed) for the EEs associated with a content area, the Learning Profile portion of the
score report will include the student information in the header, but each EE will have blank shading to
indicate no EEs were mastered.
Similarly, the student’s Performance Profile will include the student information in the header, but in
place of the body of the report, there will instead be a note indicating the student did not test in that
content area for the current academic year.
If a student was rostered more than once to the same test and appears more than once in the GRF, a separate score report is produced for each record in the GRF.
Aggregated Reports
All 2014-15 aggregated reports are delivered to the state via Hawk Drive.
 Classroom and school-level reports are in folders by school, nested within district.
 District-level reports are in a single folder.
All files are delivered in PDF format.
Note: For students who did not test in a content area, the classroom and school roster reports include a
row for that student, with an indication that the student did not test in the content area. The district
and state reports do not include students who did not test in a content area in the frequency counts.
Timelines
The dates below reflect an updated timeline, as shared with states during the July 28th partner call.
Date        Event
July 31     TAC call to review impact data and smoothing
August 4    Governance review and cut score recommendations. Student lists distributed for SEA review.
August 11   State responses due if students are missing from lists (missing students = 1-week delay in deliverables)
August 21   Data files* provided to states; score report files provided to states and/or districts

*GRF, date/time supplemental file, and incident file are all delivered at the same time.
DLM Results and State Accountability and Reporting
There is a difference between assessment results and the use of assessment results in accountability
formulas and reporting. DLM data files and score reports are based on all data about the student’s
assessments during the year, in the school where they were assessed, at any time that they were
assessed. Each state in the consortium has different rules about how and where students’ results count
for accountability purposes. They also have unique rules about when a student’s results may be
invalidated based on partially completed assessments, assessments completed outside the testing
window, or mis-administrations, among other circumstances. There is no consortium-level definition of
participation for accountability purposes.
Each state is responsible for using the DLM-generated data files and applying accountability-related
rules that impact their own reporting practices. There are no preliminary score files provided in advance
of GRF delivery. The following options are available to help states expedite their accountability
calculations and score report distribution.
 Use the student enrollment extract available on demand in Educator Portal to begin screening
student demographic and location (i.e., school) data.
o This procedure allows the state to identify students whose records may need to be
adjusted in the GRF once the file is released.
o This procedure allows the state to identify duplicate students whose testing experience may be split over the duplicate records. If those records were merged by June 10, the single combined record was used for score reporting. (A screening sketch follows this list.)
 States that have an internal review/QC process for score reports before they are released may wish to have individual student score reports sent to the state directly, or have them distributed directly to districts with instructions not to release the reports until the state has confirmed it is time to do so.
 DLM can make available the Word document prototypes for aggregated score reports to states
that wish to re-associate students to home schools for the purpose of aggregated reporting.
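As referenced in the first bullet above, a minimal sketch for screening the on-demand enrollment extract for possible duplicate students follows. The extract's actual column names may differ; the identifier column used here is an assumption.

```python
import pandas as pd

# Illustrative file name; use the extract downloaded from Educator Portal.
# dtype=str preserves leading zeroes in identifiers and organizational codes.
enrollment = pd.read_csv("enrollment_extract.csv", dtype=str)

# Flag State_Student_Identifier values that appear on more than one record;
# these are candidates for records that may need to be merged or corrected
# before the GRF is released.
dupes = enrollment[enrollment.duplicated("State_Student_Identifier", keep=False)]
print(dupes.sort_values("State_Student_Identifier"))
```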
When score reports and data files are ready for release, we will also provide some resources to assist
with interpreting the information. These include a parent interpretive guide to accompany the individual
student score report and a packet that state and district stakeholders may find helpful when
communicating with the public about aggregated results. These resources will be posted to the same
webpage where the other scoring and reporting resources are located.
Frequently Asked Questions about the GRF
Do you use school codes on the GRF?
School code is its own field. The GRF includes both the school and district codes, along with the school and district names, as they were included in the Educator Portal upload.
How are leading zeroes handled?
All leading zeroes are included when reporting organizational codes.
Can you clarify the numerical designations?
For each EE, the numerical designations are as follows. Additional numerical designations are located in
the Data Dictionary file.
0 = no evidence of mastery
1 = initial precursor mastered
2 = distal precursor mastered
3 = proximal precursor mastered
4 = target mastered
5 = successor mastered
9 = not assessed
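When processing the GRF programmatically, the designations above can be kept in a simple lookup table, as in the sketch below (the labels mirror the list above; check the Data Dictionary for any additional values).

```python
# Lookup for the per-EE numerical designations listed above.
EE_DESIGNATIONS = {
    "0": "no evidence of mastery",
    "1": "initial precursor mastered",
    "2": "distal precursor mastered",
    "3": "proximal precursor mastered",
    "4": "target mastered",
    "5": "successor mastered",
    "9": "not assessed",
}

print(EE_DESIGNATIONS.get("4", "see Data Dictionary"))  # -> "target mastered"
```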
What will populate in the cell when a student is tested but does not provide a response (distal and
higher) versus not tested at all?
We will include information on the highest linkage level mastered for every assessed Essential Element
based on mastery probabilities that are generated from student assessments. If the student has not
demonstrated mastery on any level, they will receive a “0” which indicates non-mastery. If they do not
test on the EE, they will receive a “9.”
What if a student appears on more than one roster?
If the rest of the student’s information (including State_Student_Identifier, subject, teacher, and grade)
is identical across multiple rosters, the student should have one row of data in the GRF and receive one
score report for each subject.
If the student has different identifying information across the records, the student will appear twice
with identical data. For example, a student who appears on two rosters will receive two score reports
with identical results. There would also be two rows for this student in the GRF and the student would
appear in the aggregated reports associated with each teacher/school.
What if a student appears in more than one district?
If the student was not exited from one district by June 12, the student will have two records in the GRF
with different district information and identical student data duplicated across the rows.
What if a student has two different IDs?
A student with two records that contain two different State_Student_Identifier values will receive
separate score reports for each ID, if the State_Student_Identifier values are associated with two
different DLM student IDs. Each report would only display the results for assessments taken under that
unique ID. If two State_Student_Identifier values are associated with a single DLM student ID, the report
will contain the results of all assessments.
In integrated model states, most teachers tested students in Phases B and C and during the spring testing window. What if a student was tested in only one phase or only during the spring window?
We will include linkage level mastery for all EEs on which the student was tested. The remaining EEs, which were not assessed, are assigned a "not assessed" status (value of "9").
How are students who did not test reflected in the GRF?
Students who are on active rosters but did not test are included in the GRF. If the student did not test on
any EEs for a content area, the student will receive a value of 9 for each EE in that content area. The
Performance Level for that content area will also include a 9 to indicate the student did not test in that
content area.