Step E. Collect Promising Existing Measures and/or Develop New Ones

Once the DDM team members have brainstormed all reasonable approaches to measuring educator impact on student learning in their program, they will be ready to start collecting assessments and all associated documentation. All information collected will be documented and then used by the evaluators in Steps F and G to determine the overall appropriateness of each collected measure.

Key ESE-Developed Resources to Support Step E

• Technical Guide A (see especially pp. 11, 20, 22, 24–25, and 27)

• Implementation Brief: Scoring and Setting Parameters

Team members will be looking for administration protocols, scoring guides or keys, rubrics, and guidelines for interpreting results. Along with the assessment itself, these documents comprise the most important elements of an assessment package. Team members will also be looking for answers to questions such as the following:

• When will the assessment be administered? How and by whom? Will all students take the assessment at the same time?

• How long is the test or activity?

• How will it be scored? Must scorers be trained?

• How will results be interpreted and used? What are the district’s parameters for low, moderate, and high growth?

Because this information will help ensure that all educators using the same DDM are administering it, scoring it, and interpreting its results consistently across programs, the team will want to collect as much evidence as possible to support each assessment’s use as a potential DDM.

The anticipated outcome from Step E is a collection of assessments and supporting documentation that are deemed worthy of further consideration for use as a DDM.

Lessons from the Field: Developing the DDM “Package”

Genevieve Castillo, Educator

Montachusett Regional Vocational Technical

Ms. Castillo submitted an architectural drafting pre-test/post-test assessment for peer review during the ESE-sponsored convening on DDMs in January 2014. Her submission did not include administration or scoring guidelines at that time. Knowing that others seeking to “borrow” her measure would benefit from that type of information, Ms. Castillo worked with WestEd to develop those additional pieces.

WestEd provided Ms. Castillo with a template for developing an administration protocol and scoring guide. This template included the following sections:

Administration Protocol

• Assessment Design

• Preparation for the Post-Test

• Number and Length of Assessment Sessions

• Information for Students: Pre-Test

• Information for Students: Post-Test

• Special Accommodations

Scoring Guide

• Scoring the Assessment

• Interpreting the Results

• Answer Key

WestEd drafted preliminary text in each section to jumpstart Ms. Castillo’s writing. This draft material was based on a review of her submitted pre-test/post-test assessment. For example, the assessment included 21 multiple-choice questions, 22 short-answer questions, and 7 true/false questions, so this information was inserted into the assessment design section of the administration protocol template. In addition, WestEd suggested that Ms. Castillo include information about the assessment’s development practices and refinements incorporated over time, so that potential users can make informed decisions about its appropriateness for various student populations.

In discussion with WestEd, Ms. Castillo quickly discovered that developing an administration protocol and scoring guide involved systematically documenting the options she had considered and the decisions she had reached at each stage of development. She realized that she had all of the relevant information but had never thought to record it, because she was the only educator administering and scoring the assessment. A key outcome of this exercise was Ms. Castillo’s realization that such documentation is essential if colleagues are to use her measure.

Ms. Castillo’s extra effort will enable standardized administration of the measure and consistent scoring by Career/Vocational Technical Education (CVTE) colleagues who may want to use it as a DDM for their own drafting programs.

Please see Appendix A for more information about this assessment.
