Standard Protocol for Evaluating Progress Monitoring Tools

NATIONAL CENTER ON INTENSIVE INTERVENTION
PROTOCOL FOR EVALUATING PROGRESS MONITORING TOOLS
May 2012
The National Center on Intensive Intervention defines progress monitoring as repeated measurement
of academic performance for the purpose of helping schools individualize instructional programs for
students in grades K-12 who have intensive instructional needs. For this purpose, progress monitoring
data are collected weekly to assess whether student progress is adequate to meet the student’s
instructional goal. If not, the teacher adjusts the instructional program to better meet the student’s
needs and continues to monitor progress. This process recurs throughout intervention to formatively
develop an effective, individually tailored instructional program. In the Center’s work and in this
document, we refer to this use of progress monitoring as data-based individualization.
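To make the weekly monitoring cycle described above concrete, the sketch below compares a student’s weekly scores against a goal line drawn from a baseline score to an end-of-year goal. It is an illustrative Python sketch only; the scores, the goal values, and the simple goal-line check are assumptions, not part of the Center’s protocol or of any particular tool.

```python
# Illustrative sketch of the weekly data-based individualization cycle described
# above. All values and the goal-line heuristic are hypothetical assumptions,
# not part of the Center's protocol or any specific tool.

def goal_line(baseline, goal, total_weeks):
    """Expected score at each week if the student grows linearly toward the goal."""
    weekly_gain = (goal - baseline) / total_weeks
    return [baseline + weekly_gain * week for week in range(1, total_weeks + 1)]

def progress_is_adequate(weekly_scores, baseline, goal, total_weeks):
    """Compare the most recent observed scores with the goal line for those weeks."""
    expected = goal_line(baseline, goal, total_weeks)
    window = min(3, len(weekly_scores))               # look at the last few points
    recent = weekly_scores[-window:]
    expected_recent = expected[len(weekly_scores) - window:len(weekly_scores)]
    return all(obs >= exp for obs, exp in zip(recent, expected_recent))

# Hypothetical student: baseline of 10, end-of-year goal of 40, 30-week year.
scores = [11, 12, 12, 14, 13, 15]                     # six weeks of monitoring
if not progress_is_adequate(scores, baseline=10, goal=40, total_weeks=30):
    print("Progress is not adequate; adjust instruction and continue monitoring.")
```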
Please Read Before You Start
Q1. My tool does not have enough alternate forms to conduct progress monitoring at least weekly, as required for data-based individualization. Can I still submit my tool for review?

No. The TRC will only review tools that meet its definition of progress monitoring, as follows:
The National Center on Intensive Intervention defines progress monitoring as repeated
measurement of academic performance for the purpose of helping schools individualize
instructional programs for students in grades K-12 who have intensive instructional needs.
For this purpose, progress monitoring data are collected weekly to assess whether student
progress is adequate to meet the student’s instructional goal. If not, the teacher adjusts the
instructional program to better meet the student’s needs and continues to monitor progress.
This process recurs throughout intervention to formatively develop an effective, individually
tailored instructional program. In the Center’s work and in this document, we refer to this use
of progress monitoring as data-based individualization.
Because this definition requires that progress monitoring be conducted at least weekly, the
TRC requires that tools submitted for review have at least 20 alternate forms. Please
include with this submission documentation of at least 20 alternate forms. The Center will not
distribute your protocol for review unless you demonstrate evidence of having at least 20
alternate forms. If your tool is computer delivered, please note how the test forms are derived
instead of providing alternate forms.
Q2. My progress monitoring tool assesses multiple domains of academic performance (e.g., reading vs. mathematics, or math computation vs. concepts/applications). Do I need a separate protocol for each domain?
Yes. The Center recognizes that for products designed to measure progress in multiple
academic domains, some of the information to be submitted in the protocol will be the same.
However, the tool for each academic domain or subcomponent within a domain will be
evaluated and reported separately on our tools chart. Therefore, if your tool assesses more
than one domain/subcomponent, you MUST submit separate protocols for EACH
domain/subcomponent. For example, if your tool measures sub-components of reading, such
as letter name fluency, letter sound fluency, and passage reading fluency, you must submit a
separate protocol for each.
Q3. The protocol requires information that is already included in a technical report or research study. Can I submit this study instead of filling out the protocol?

No. Technical reports and relevant research papers may be submitted as supporting
information, but you MUST COMPLETE THE FULL PROTOCOL. Reviewers will use the
information in the protocol to make their judgments. They are not expected to search for and
find additional information in accompanying materials.
Q4. The protocol requires information that is not currently available. Can I still submit my progress-monitoring tool?

Yes. The Protocol for Evaluating Progress Monitoring Tools is designed to collect
comprehensive and detailed information on the submitted progress-monitoring tools to ensure
rigorous evaluation of tools. Therefore, tools that are undergoing improvements or are in an
early phase of development may not have all the information the protocol asks for. Please
provide as much information as possible.

If your submission packet is found to need a substantial amount of supplemental information
or to be missing critical information, the entire packet will be returned to you. A revised
protocol packet with additional information may be resubmitted.
Q6. Can I submit a tool to be evaluated for BOTH General Outcome Measurement (GOM) and Mastery Measurement (MM)?
Yes. However, you must fill out BOTH the GOM and MM sections of the protocol and provide
TWO separate sets of evidence: one set that addresses the standards for GOM and another
set that addresses the standards for MM.
Q7. I am not familiar with some of the terms in the protocol, and thus I am not sure what information I should provide. What should I do?

Center staff are available to answer your questions or to assist you in completing the protocol
for submission. Please contact the National Center on Intensive Intervention:
National Center on Intensive Intervention
American Institutes for Research
1000 Thomas Jefferson Street, NW
Washington, DC 20007
E-mail: NCII@air.org
Phone: 866-577-5787
Q8. Can I withdraw my tool from the review process?

No. Results of the review will be posted on the Center’s website, in the Progress
Monitoring Tools Chart. Once the review has begun, withdrawal from the process
is not permitted.
SECTION I: BASIC INFORMATION
A. TOOL
Title: Click here to enter text.
Developer: Click here to enter text.
Publisher: Click here to enter text.
Publication Date: Click here to enter text.
Contact Person: Name: Click here to enter text.
Telephone: Click here to enter text.
E-mail Address: Click here to enter text.
B. DESCRIPTIVE INFORMATION
Description of tool: Click here to enter text.
1. The tool is intended for use in grade(s) (Check all that apply):
Kindergarten
First Grade
Second Grade
Third Grade
Fourth Grade
Fifth Grade
Sixth Grade
Seventh Grade
Eighth Grade
Ninth Grade
Tenth Grade
Eleventh Grade
Twelfth Grade
2. The tool assesses one or more of the following dimensions (Check all that apply):
□ READING
Global Indicator of Reading Competence
Phonemic Awareness
Decoding
Word Identification
Passage Reading
Comprehension
Listening Comprehension
Vocabulary
Other: Click here to enter text.
□ MATHEMATICS
Global Indicator of Mathematics Competence
Early Numeracy
Mathematics Computation
Mathematics Concepts
Mathematics Application
Fractions
Algebra
Other: Click here to enter text.
□ SPELLING ACHIEVEMENT
Global Indicator of Spelling Competence
Other (List specific skills or subtests): Click here to enter text.
□ WRITTEN EXPRESSION
Global Indicator of Written Expression Competence
List specific dimensions, skills, or subtests: Click here to enter text.
□ OTHER
List specific skills or subtests:
Click here to enter text.
Notes:
Click here to enter text.
The tool provides information on student performance in:
English
Spanish
Other
Click here to enter text.
Acquisition Information:
Where to Obtain: Click here to enter text.
Address: Click here to enter text.
Phone: Click here to enter text.
Web Site: Click here to enter text.
Cost per student for Year 1: Click here to enter text.
Including:
$ Click here Complete Kit (describe contents) Click here to enter text.
$ Click here Manuals and Test Materials Click here to enter text.
$ Click here Directions for Administration Click here to enter text.
$ Click here Test Forms
$ Click here Technical Manuals
$ Click here Protocol per Student
Other (Describe below):
$ Click here
$ Click here
$ Click here
$ Click here
Cost per student for subsequent years: Click here to enter text.
Including:
$ Click here Complete Kit (describe contents) Click here to enter text.
$ Click here Manuals and Test Materials Click here to enter text.
$ Click here Directions for Administration Click here to enter text.
$ Click here Test Forms
$ Click here Technical Manuals
$ Click here Protocol per Student
Other (Describe below):
$ Click here
$ Click here
Online Costs:
$ Click here Start-up
$ Click here Per Student cost
Other (Describe below):
$ Click here
$ Click here
Additional Information on the tools:
Describe basic pricing plan and/or structure of the tools. Also, provide information on what is included in the
published tools, including information about special accommodations for students with disabilities.
Click here to enter text.
SECTION II: DEVELOPMENT AND ADMINISTRATION
A. TIME, ADMINISTRATION, AND FREQUENCY
Assessment format (Check all that apply):
Individual
Group
Computer-administered
Administration time: Click here minutes
Additional scoring time: Click here minutes
Discontinue Rules:
No
Basal
Ceiling
Other:
Click here to enter text.
How many alternate forms are available? (Please attach each alternate form.)
Yes: # of forms per grade, test level, or unit: Click here
B. TRAINING
Time required for training tester:
Less than 1 hour of training
1-4 hours of training
4-8 hours of training
Information not available
Minimum qualifications of the examiner:
Professional
Paraprofessional
Information not available
Training manuals and materials available:
Yes
No
Training manuals/materials field-tested:
Yes
No
Training manuals/materials included in cost of tool:
Yes
No
Sources for ongoing technical support available:
Yes
No
If yes, please explain:
Click here to enter text.
C. SCORING
Types of scores available:
1. Do you provide scores on level performance?
Yes
No
If yes, please check all that apply:
raw score
standard score
percentile score
grade equivalents
normal curve equivalents
stanines
developmental benchmarks
IRT-based score
composite scores
error analysis
subscale/subtest scores
other (specify): Click here to enter text.
Basis for calculating level standard & percentile scores:
age norms
grade norms
stanines
normal curve equivalents
Scoring Structure (specify how raw scores are calculated and what comprises cluster/composite
score): Click here to enter text.
2. Do you provide the basis for calculating slope (i.e., the amount of improvement per unit of time)?
Yes
No
Do you provide benchmarks for the slopes?
Yes
No
Do you provide percentile ranks for the slopes?
Yes
No
Basis for calculating slope standard & percentile scores:
age norms
grade norms
stanines
normal curve equivalents
Scoring Structure (specify how raw scores are calculated and what comprises cluster/composite score):
Click here to enter text.
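To illustrate the items above, here is a minimal Python sketch of one common basis for calculating slope (an ordinary least-squares slope of weekly scores, i.e., improvement per week) and of a percentile rank for that slope against a set of normative slopes. The data, and the choice of ordinary least squares, are illustrative assumptions; tools may calculate slope differently.

```python
# Hypothetical sketch: one common basis for calculating slope (improvement per week)
# and a percentile rank for that slope against normative slopes. The data and the
# choice of ordinary least squares are illustrative assumptions only.
from statistics import mean

def weekly_slope(scores):
    """Ordinary least-squares slope of score on week number (units gained per week)."""
    weeks = list(range(1, len(scores) + 1))
    x_bar, y_bar = mean(weeks), mean(scores)
    numerator = sum((x - x_bar) * (y - y_bar) for x, y in zip(weeks, scores))
    denominator = sum((x - x_bar) ** 2 for x in weeks)
    return numerator / denominator

def percentile_rank(value, norm_values):
    """Percentage of normative slopes at or below the given slope."""
    return 100.0 * sum(v <= value for v in norm_values) / len(norm_values)

student_scores = [22, 24, 23, 27, 28, 30]                 # hypothetical weekly scores
norm_slopes = [0.4, 0.7, 0.9, 1.0, 1.2, 1.5, 1.8, 2.1]    # hypothetical grade-level norms

slope = weekly_slope(student_scores)
print(f"Slope: {slope:.2f} units per week")
print(f"Percentile rank of slope: {percentile_rank(slope, norm_slopes):.0f}")
```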
Describe the tool’s approach to progress monitoring, behavior samples, test format, and/or scoring practices.
Click here to enter text.
Describe the tool’s methods for avoiding cultural or linguistic bias.
Click here to enter text.
SECTION III: TECHNICAL INFORMATION
What approach to progress monitoring is used?
General outcome measurement (Complete GOM 1-12.)
With GOM, alternate forms of the progress monitoring measure are of comparable difficulty, or IRT-based
methods are used, with each test sampling the construct in the same way. With GOM, progress
toward a year-end goal is monitored.
Mastery Measurement (Complete MM 1-9.)
With MM, the objective targeted for mastery changes over time. That is, criterion-referenced
assessment of an objective continues with alternate forms of a test (each test containing one type of item)
until mastery is achieved. Then a new objective (the next one in the sequence) is targeted for
monitoring, and so on.
General outcome measurement and Mastery Measurement (Complete BOTH GOM 1-12 and MM 1-MM9.)
Please provide TWO separate sets of evidence – one set which addresses the standards for GOM and
another set which addresses the standards for MM.
General Outcome Measures
A. Psychometric Standards
GOM 1. In the chart below, report the reliability of the performance level score (e.g., internal
consistency, stability, test-retest reliability, alternate form reliability).
Type of Reliability | Age or Grade | n (range) | Coefficient range | Coefficient median | SEM | Information (including normative data) / Subjects

Manual cites other published reliability studies:   Yes   No
*Provide citations for additional published studies.
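For reference on the SEM column above, the following is a minimal sketch of the classical test theory relationship between a reliability coefficient, the score standard deviation, and the standard error of measurement; the values are hypothetical, and submitted tools may derive SEM differently.

```python
# Minimal sketch of the classical-test-theory standard error of measurement:
# SEM = SD * sqrt(1 - reliability). Values below are hypothetical.
import math

def standard_error_of_measurement(sd, reliability):
    return sd * math.sqrt(1.0 - reliability)

print(standard_error_of_measurement(sd=12.0, reliability=0.91))  # -> about 3.6
```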
GOM 1a. Do you provide, in your user’s manual, reliability for the performance level score that is
disaggregated by race/ethnicity? If so, complete below for each race/ethnicity for which you
provide disaggregated reliability for the performance level score data.
Type of Reliability | Age or Grade | n (range) | Coefficient range | Coefficient median | SEM | Information (including normative data) / Subjects

Manual cites other published reliability studies:   Yes   No
*Provide citations for additional published studies.
GOM 2. In the chart below, report reliability for the slope (e.g., ratio of true slope variance to total
slope variance) by grade level (if relevant).
Type of Reliability | Age or Grade | n (range) | Coefficient range | Coefficient median | SEM | Information (including normative data) / Subjects

Manual cites other published reliability studies:   Yes   No
*Provide citations for additional published studies.
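As one hedged illustration of slope reliability defined as the ratio of true slope variance to total slope variance, the sketch below uses a simple method-of-moments estimate: the variance of students' observed slopes minus the average sampling-error variance of those slopes, divided by the observed variance. Real analyses often use multilevel or latent growth models instead, and all values here are hypothetical.

```python
# Hedged sketch: slope reliability as the ratio of true slope variance to total
# (observed) slope variance, using a simple method-of-moments estimate.
# Observed variance = true variance + average sampling-error variance of the slopes.
# All inputs are hypothetical; real analyses often use multilevel growth models.
from statistics import pvariance, mean

observed_slopes = [0.6, 1.1, 0.9, 1.4, 0.7, 1.2, 1.0, 1.6]   # one OLS slope per student
slope_error_variances = [0.05, 0.07, 0.04, 0.06, 0.05, 0.08, 0.05, 0.06]  # squared SEs

total_variance = pvariance(observed_slopes)
true_variance = max(total_variance - mean(slope_error_variances), 0.0)

reliability = true_variance / total_variance
print(f"Estimated slope reliability: {reliability:.2f}")
```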
GOM 2a. Do you provide, in your user’s manual, reliability for the slope that is disaggregated by
race/ethnicity? If so, complete below for each race/ethnicity for which you provide
disaggregated reliability for the slope data.
Type of Reliability | Age or Grade | n (range) | Coefficient range | Coefficient median | SEM | Information (including normative data) / Subjects

Manual cites other published reliability studies:   Yes   No
*Provide citations for additional published studies.
GOM 3. In the chart below, report validity information for the performance level score (e.g., content, concurrent, predictive, and/or construct***).

Type of Validity | Age or Grade | Test or Criterion | n (range) | Coefficient range | Coefficient median | Information (including normative data) / Subjects

Other forms of validity: Click here to enter text.
Manual cites other published validity studies:   Yes   No
*Provide citations for additional published studies.
*** Validity information may also include: evidence based on response processes, evidence based on internal
structure, evidence based on relations to other variables, and/or evidence based on consequences of testing.
NOTE: To support validity, the TRC prefers and strongly encourages criterion measures that are external to
the progress monitoring system. If internal measures are used, please include a description of what
provisions have been taken to address the limitations of this method, such as possible method variance or
overlap of item samples: Click here to enter text.
GOM 3a. Do you provide, in your user’s manual, validity for the performance level score that is
disaggregated by race/ethnicity? If so, complete below for each race/ethnicity for which you
provide disaggregated validity for the performance level score data.
Type of Validity | Age or Grade | Test or Criterion | n (range) | Coefficient range | Coefficient median | Information (including normative data) / Subjects

Other forms of validity: Click here to enter text.
Manual cites other published validity studies:   Yes   No
*Provide citations for additional published studies.
*** Validity information may also include: evidence based on response processes, evidence based on internal
structure, evidence based on relations to other variables, and/or evidence based on consequences of testing.
NOTE: To support validity, the TRC prefers and strongly encourages criterion measures that are external to
the progress monitoring system. If internal measures are used, please include a description of what
provisions have been taken to address the limitations of this method, such as possible method variance or
overlap of item samples: Click here to enter text.
GOM 4. In the chart below, report predictive validity information for the slope of improvement
(correlation between the slope and achievement outcome).
**Please note, the TRC suggests controlling for initial level when the correlation for slope
without such control is not adequate.
Type of Validity | Age or Grade | Test or Criterion | n (range) | Coefficient range | Coefficient median | Information (including normative data) / Subjects

Other forms of validity: Click here to enter text.
Manual cites other published validity studies:   Yes   No
*Provide citations for additional published studies.
NOTE: To support validity, the TRC prefers and strongly encourages criterion measures that are external to
the progress monitoring system. If internal measures are used, please include a description of what
provisions have been taken to address the limitations of this method, such as possible method variance or
overlap of item samples: Click here to enter text.
GOM 4a. Do you provide, in your user’s manual, predictive validity information for the slope of
improvement that is disaggregated by race/ethnicity? If so, complete below for each
race/ethnicity for which you provide disaggregated predictive validity for the slope of
improvement data.
Type of Validity | Age or Grade | Test or Criterion | n (range) | Coefficient range | Coefficient median | Information (including normative data) / Subjects

Other forms of validity: Click here to enter text.
Manual cites other published validity studies:   Yes   No
*Provide citations for additional published studies.
NOTE: To support validity, the TRC prefers and strongly encourages criterion measures that are external to
the progress monitoring system. If internal measures are used, please include a description of what
provisions have been taken to address the limitations of this method, such as possible method variance or
overlap of item samples: Click here to enter text.
B. PROGRESS MONITORING STANDARDS
GOM 5. Provide evidence that alternate forms are of equal and controlled difficulty or, if IRT-based,
provide evidence of item or ability invariance (attach documentation of direct evidence).
What is the number of alternate forms of equal and controlled difficulty? Click here
If IRT based, provide evidence of item or ability invariance (attach documentation): Click here
If computer administered, how many items are in the item bank for each grade level? Click here
If your tool is computer administered, please note how the test forms are derived instead of
providing alternate forms: Click here
GOM 6. Is minimum acceptable growth (slope of improvement or average weekly increase in score
by grade level) specified in your manual or published materials?
Yes
No
Specify the growth standards: Click here to enter text.
What is the basis for specifying minimum acceptable growth?
Norm-referenced
Criterion-referenced
Other (please describe):
Click here to enter text.
GOM 6a. If norm-referenced, describe the normative profile:
NOTE: If your tool is posted on our online tools chart, the Center will make the information
you provide below on the normative profile available to consumers.
National representation?   Yes   No
Local representation?   Yes   No
Date: Click here to enter text.
Number of States: Click here to enter text.
Size: Click here to enter text.
Regions: Click here to enter text.
Gender (Percent):
Male
Female
Unknown
SES (Percent, check all reported):
Low
Middle
High
Parents did not graduate high school
Parents graduated high school
Parents had 1-3 years of college
Parents had 4 or more years of college
Other SES Indicators: Click here to enter text.
Race/Ethnicity (Percent):
White, Non-Hispanic
Black, Non-Hispanic
Hispanic
American Indian/Alaska Native
Asian/Pacific Islander
Other
Unknown
ELL (Percent): Click here to enter text.
Disability classification (Percent): Click here to enter text.
GOM 6b. Do you provide, in your user’s manual, norms which are disaggregated by race or ethnicity?
If so, for which race/ethnicity? (Check all that apply)
Race/Ethnicity:
White, Non-Hispanic
Black, Non-Hispanic
Hispanic
American Indian/Alaska Native
Asian/Pacific Islander
Other
Unknown
GOM 6c. If criterion-referenced, describe procedure for specifying criterion for adequate growth
(attach documentation): Click here to enter text.
GOM 6d. Please describe other procedures for specifying adequate growth: Click here to enter text.
GOM 7. Are benchmarks for minimum acceptable end-of-year performance specified in your manual
or published materials?
Yes
No
Specify the end-of-year performance standards: Click here to enter text.
What is the basis for specifying minimum acceptable end-of-year performance?
Norm-referenced
Criterion-referenced
Other (please describe):
Click here to enter text.
Specify the benchmarks: Click here to enter text.
What is the basis for specifying these benchmarks?
Norm-referenced
Criterion-referenced
Other (please describe):
Click here to enter text.
GOM 7a. If norm-referenced, describe the normative profile:
If same as GOM 6a, check here: □. Otherwise, complete below.
National representation?   Yes   No
Local representation?   Yes   No
Date: Click here to enter text.
Number of States: Click here to enter text.
Size: Click here to enter text.
Regions: Click here to enter text.
Gender (Percent):
Male
Female
Unknown
SES (Percent, check all reported):
Low
Middle
High
Parents did not graduate high school
Parents graduated high school
Parents had 1-3 years of college
Parents had 4 or more years of college
ELL (Percent): Click here to enter text.
Disability classification (Percent): Click here to enter text.
GOM 7b. If criterion-referenced, describe procedure for specifying benchmarks for end-of-year
performance levels (attach documentation): Click here to enter text.
GOM 7c. Please describe any other procedures for specifying minimal acceptable end-of-year
performance: Click here to enter text.
GOM 8. Describe evidence that the monitoring system produces data that are sensitive to student
improvement (i.e., when student learning actually occurs, student performance on the
monitoring tool increases on average): Click here to enter text.
C. DATA-BASED INDIVIDUALIZATION STANDARDS
GOM 9. Does your manual or published materials specify validated decision rules for when changes
to instruction need to be made?
Yes
No
Specify the decision rules: Click here to enter text.
What is the evidentiary basis for these decision rules? Click here to enter text.
GOM 10. Does your manual or published materials specify validated decision rules for when to increase goals?
Yes
No
Specify the decision rules: Click here to enter text.
What is the evidentiary basis for these decision rules? Click here to enter text.
GOM 11. Describe evidence that teachers’ use of the tool results in improved student achievement.
For this GOM, please describe and attach an empirical study that provides this evidence.
Study Citation: Click here to enter text.
GOM 11a. Study Sample
Number of students in product condition: Click here to enter text.
Number of students in control condition: Click here to enter text.
Describe characteristics of students in sample and how they were selected for participation in
study: Click here to enter text.
GOM 11b. Study Design
Was random assignment used?
Yes
No
If not, please specify the type of design: Click here to enter text.
What was the unit of assignment?
Schools
Teachers
Students
What was the unit of analysis?
Schools
Teachers
Students
What was the duration of product implementation? Click here to enter text.
Describe analysis: Click here to enter text.
GOM 11c. Study Fidelity
Describe when and how fidelity of treatment information was obtained:
Click here to enter text.
What were the results on the fidelity of treatment implementation measure:
Click here to enter text.
GOM 11d. Study Measures
In the table below, please list external outcome measures used in the study, along with psychometric
properties (e.g., Cronbach’s alpha, IRT reliability, temporal stability, inter-rater)
Measure name | Reliability statistics (specify type of reliability, e.g., Cronbach’s alpha, IRT reliability, temporal stability, inter-rater)
GOM 11e. Study Results
Describe the results of the study:
Click here to enter text.
Report effect sizes for each outcome measure:
Measure name | Effect size
Summarize conclusions and explain conditions to which effects should be generalized:
Click here to enter text.
Other related references or information:
Click here to enter text.
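For the effect-size table in GOM 11e, one widely used index is the standardized mean difference (Cohen’s d with a pooled standard deviation). The protocol does not prescribe a particular index, so the sketch below is illustrative only, and the group statistics are hypothetical.

```python
# Hedged sketch of a standardized mean difference (Cohen's d, pooled SD) for a
# product-vs-control comparison. Summary statistics below are hypothetical.
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Hypothetical outcome means/SDs for product and control conditions.
print(round(cohens_d(mean_t=52.0, sd_t=10.0, n_t=60, mean_c=48.0, sd_c=11.0, n_c=58), 2))
```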
GOM 12. Describe evidence that teachers’ use of the tool results in improved planning. For this GOM,
please describe and attach an empirical study that provides this evidence.
Study Citation: Click here to enter text.
GOM 12a. Study Sample
Number of students in product condition: Click here to enter text.
Number of students in control condition: Click here to enter text.
Describe characteristics of students in sample and how they were selected for participation in
study: Click here to enter text.
GOM 12b. Study Design
Was random assignment used?
Yes
No
If not, please specify the type of design: Click here to enter text.
What was the unit of assignment?
Schools
Teachers
Students
What was the unit of analysis?
Schools
Teachers
Students
What was the duration of product implementation? Click here to enter text.
Describe analysis: Click here to enter text.
GOM 12c. Study Fidelity
Describe when and how fidelity of treatment information was obtained:
Click here to enter text.
What were the results on the fidelity of treatment implementation measure:
Click here to enter text.
GOM 12d. Study Measures
In the table below, please list external outcome measures used in the study, along with psychometric
properties (e.g., Cronbach’s alpha, IRT reliability, temporal stability, inter-rater)
Measure name | Reliability statistics (specify type of reliability, e.g., Cronbach’s alpha, IRT reliability, temporal stability, inter-rater)
GOM 12e. Study Results
Describe the results of the study: Click here to enter text.
Report effect sizes for each outcome measure:
Measure name | Effect size
Summarize conclusions and explain conditions to which effects should be generalized:
Click here to enter text.
Other related references or information: Click here to enter text.
Mastery Measures
A. Psychometric Standards
MM 1. Report reliability coefficients for the tests incorporated in the system:
Type of Reliability | Age or Grade | n (range) | Coefficient range | Coefficient median | SEM | Information (including normative data) / Subjects

Manual cites other published reliability studies:   Yes   No
*Provide citations for additional published studies.
How many items comprise a single test?
MM 1a. Do you provide, in your user’s manual, reliability coefficients for the tests incorporated in the system that are disaggregated by race/ethnicity? If so, complete below for each race/ethnicity for which you provide disaggregated reliability data.
Type of Reliability | Age or Grade | n (range) | Coefficient range | Coefficient median | SEM | Information (including normative data) / Subjects

Manual cites other published reliability studies:   Yes   No
*Provide citations for additional published studies.
MM 2. What is the correlation between the number of skills mastered over the course of an academic year and an important end-of-year outcome? Report these validity coefficients:

Type of Validity | Age or Grade | Test or Criterion | n (range) | Coefficient range | Coefficient median | Information (including normative data) / Subjects

Other forms of validity: Click here to enter text.
Manual cites other published validity studies:   Yes   No
*Provide citations for additional published studies.
*** Validity information may also include: evidence based on response processes, evidence based on internal
structure, evidence based on relations to other variables, and/or evidence based on consequences of testing.
NOTE: To support validity, the TRC prefers and strongly encourages criterion measures that are external to
the progress monitoring system. If internal measures are used, please include a description of what
provisions have been taken to address the limitations of this method, such as possible method variance or
overlap of item samples: Click here to enter text.
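Because MM 2 asks for the correlation between the number of skills mastered over the year and an end-of-year outcome, here is a minimal Pearson correlation sketch; the paired data are hypothetical, and the `statistics.correlation` helper requires Python 3.10 or later.

```python
# Minimal sketch: Pearson correlation between skills mastered during the year and
# an end-of-year outcome score. The paired data below are hypothetical.
from statistics import correlation  # Python 3.10+

skills_mastered = [4, 6, 5, 9, 7, 10, 8, 12]
end_of_year_outcome = [410, 432, 420, 455, 440, 468, 450, 480]

print(f"r = {correlation(skills_mastered, end_of_year_outcome):.2f}")
```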
MM 2a. Do you provide, in your user’s manual, validity coefficients that are disaggregated by race/ethnicity? If so, complete below for each race/ethnicity for which you provide disaggregated validity data.

Type of Validity | Age or Grade | Test or Criterion | n (range) | Coefficient range | Coefficient median | Information (including normative data) / Subjects

Other forms of validity: Click here to enter text.
Manual cites other published validity studies:   Yes   No
*Provide citations for additional published studies.
*** Validity information may also include: evidence based on response processes, evidence based on internal
structure, evidence based on relations to other variables, and/or evidence based on consequences of testing.
NOTE: To support validity, the TRC prefers and strongly encourages criterion measures that are external to
the progress monitoring system. If internal measures are used, please include a description of what
provisions have been taken to address the limitations of this method, such as possible method variance or
overlap of item samples: Click here to enter text.
B. PROGRESS MONITORING STANDARDS
MM 3. Type of evidence for the skill sequence (instructional hierarchy) on which the MM system is based:
Logical
Describe Logic: Click here to enter text.
Expert judgment
Describe procedure for deriving these judgments: Click here to enter text.
Tied to curriculum
Describe evidence that this curriculum/program is research validated (attach journal
articles): Click here to enter text.
MM 4. Describe evidence that the monitoring system produces data that are (a) sensitive to children’s development of academic competence in this area and/or (b) sensitive to the effects of effective interventions (attach journal articles).
Click here to enter text.
MM 5. What is the basis for defending the pass/fail (mastered/nonmastered) decisions in the system?
Click here to enter text.
Describe the sensitivity/specificity of these pass/fail decisions: Click here to enter text.
How are false negatives and false positives assessed for the benchmarks? (Criterion and grade/age) Click here to enter text.

EVIDENCE OF SENSITIVITY
True Positive:      False Positive:
True Negative:      False Negative:

Odds ratios and conditional probabilities, if given: Click here to enter text.
MM 5a. Do you provide, in your user’s manual, sensitivity and specificity data that are disaggregated by race/ethnicity? If so, complete below for each race/ethnicity for which you provide disaggregated sensitivity and specificity data.

True Positive:      False Positive:
True Negative:      False Negative:
RELIABILITY OF DECISIONS
Calculations:
Specificity: TN / (TN + FP) = Click here to enter text.
Sensitivity: TP / (TP + FN) = Click here to enter text.
Hit rate: (TP + TN) / N = Click here to enter text.
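The specificity, sensitivity, and hit-rate calculations above map directly onto code; the sketch below applies those same formulas to hypothetical counts of true and false positives and negatives.

```python
# Minimal sketch of the specificity, sensitivity, and hit-rate calculations above,
# using hypothetical counts of true/false positives and negatives.
def decision_accuracy(tp, fp, tn, fn):
    total = tp + fp + tn + fn
    return {
        "specificity": tn / (tn + fp),   # TN / (TN + FP)
        "sensitivity": tp / (tp + fn),   # TP / (TP + FN)
        "hit_rate": (tp + tn) / total,   # (TP + TN) / N
    }

print(decision_accuracy(tp=40, fp=8, tn=45, fn=7))
```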
C. DATA-BASED INDIVIDUALIZATION STANDARDS
MM 6. Does your manual or published materials specify validated decision rules for when changes to instruction need to be made?
Yes
No
Specify the decision rules here: Click here to enter text.
What is the evidentiary basis for these decision rules? Click here to enter text.
MM 7. Does your manual or published materials specify decision rules for when to increase goals?
Yes
No
Specify the decision rules here:
Click here to enter text.
What is the evidentiary basis for these decision rules? Click here to enter text.
MM 8. Describe evidence that teachers’ use of the tool results in improved student achievement. For this MM, please describe and attach an empirical study that provides this evidence.
Study Citation: Click here to enter text.
MM 8a. Study Sample
Number of students in product condition: Click here to enter text.
Number of students in control condition: Click here to enter text.
Describe characteristics of students in sample and how they were selected for participation in
study: Click here to enter text.
MM 8b. Study Design
Was random assignment used?
Yes
No
If not, please specify the type of design: Click here to enter text.
What was the unit of assignment?
Schools
Teachers
Students
What was the unit of analysis?
Schools
Teachers
Students
What was the duration of product implementation? Click here to enter text.
Describe analysis: Click here to enter text.
MM 8c. Study Fidelity
Describe when and how fidelity of treatment information was obtained:
Click here to enter text.
What were the results on the fidelity of treatment implementation measure:
Click here to enter text.
MM 8d. Study Measures
In the table below, please list external outcome measures used in the study, along with psychometric
properties (e.g., Cronbach’s alpha, IRT reliability, temporal stability, inter-rater)
Measure name | Reliability statistics (specify type of reliability, e.g., Cronbach’s alpha, IRT reliability, temporal stability, inter-rater)

MM 8e. Study Results
Describe the results of the study:
Click here to enter text.
Report effect sizes for each outcome measure:
Measure name | Effect size
Summarize conclusions and explain conditions to which effects should be generalized:
Click here to enter text.
Other related references or information:
Click here to enter text.
MM 9. Describe evidence that teachers’ use of the tool results in improved planning. For this MM, please describe and attach an empirical study that provides this evidence.
Study Citation: Click here to enter text.
MM 9a. Study Sample
Number of students in product condition: Click here to enter text.
Number of students in control condition: Click here to enter text.
Describe characteristics of students in sample and how they were selected for participation in
study: Click here to enter text.
MM 9b. Study Design
Was random assignment used?
Yes
No
If not, please specify the type of design: Click here to enter text.
What was the unit of assignment?
Schools
Teachers
Students
What was the unit of analysis?
Schools
Teachers
Students
What was the duration of product implementation? Click here to enter text.
Describe analysis: Click here to enter text.
MM 9c. Study Fidelity
Describe when and how fidelity of treatment information was obtained:
Click here to enter text.
What were the results on the fidelity of treatment implementation measure:
Click here to enter text.
MM 9d. Study Measures
In the table below, please list external outcome measures used in the study, along with psychometric
properties (e.g., Cronbach’s alpha, IRT reliability, temporal stability, inter-rater)
Measure name | Reliability statistics (specify type of reliability, e.g., Cronbach’s alpha, IRT reliability, temporal stability, inter-rater)

MM 9e. Study Results
Describe the results of the study:
Click here to enter text.
Report effect sizes for each outcome measure:
Measure name | Effect size
Summarize conclusions and explain conditions to which effects should be generalized:
Click here to enter text.
Other related references or information:
Click here to enter text.