District-Determined Measure Example
Using a Variety of Assessments (Formal and Informal)
Content Area and Grade Range: School Psychologists, grades PK-12
DDM Summary: This DDM measures the extent to which school psychologists
assess students' needs using a variety of assessments and thereby have an indirect
impact on students' learning by strengthening the conditions that support students'
learning differences and needs. A second measure, also included, assesses how
effectively the school psychologist conveys evaluation findings to the Team.
Developed by: Lydia Rundell-Gjerde, School Psychologist/Team Chairperson,
Lynnfield Public Schools
Pilot Districts: Lynnfield Public Schools
Date updated: June 2015
Table of Contents
Introduction
Instrument
Administration Protocol
Scoring Guide
Measuring Growth and Setting Target Parameters
Piloting
Assessment Blueprint
Introduction
Description of the Measure
This DDM is a target measure of the school psychologist’s indirect impact on students’
learning. Specifically, it measures the extent to which the school psychologist uses a
variety of student-specific informal and formal assessment methods to evaluate referred
students.
The measure includes the School Psychologist Assessment Checklist, which the
school psychologist uses to document the range of assessment forms and methods
used for each student's evaluation.
Potential Modifications
Although designed as a target measure for school psychologists, this measure can be
revised for use as a growth measure if the DDM objectives need improvement, rather
than maintenance. For example, the purpose of the three data analysis points in this
DDM – November, February, and June – is to monitor results and adjust practice if
needed. These data points could be used, however, to analyze growth over the course
of the year. This would require that the provided Target Parameters be adjusted to
serve as Growth Parameters.
Further, this measure was designed for school psychologists who have relatively small
caseloads. Collecting more data over time with a small sample size strengthens the
validity of inferences that may be drawn about the school psychologist’s assessment
methods and communication skills. For school psychologists with large caseloads,
however, a district may decide to limit the data collection to a subset of randomly
selected students. Alternatively, a district may choose to collect data for the full
caseload of students but limit the data collection period to the two or three months of
the school year when student evaluations are most common.
Finally, a second component, a School Psychologist Communication Survey, could
be included. This component solicits Team members' perspectives on the effectiveness
of the school psychologist’s written and oral communication of the student’s evaluation
results.
DDM Objective and Rationale
A primary role of the school psychologist is to provide informative evaluations of
students' learning needs. The psychologist also presents that information to the
student's Team in a manner that gives Team members a better understanding of the
student's overall profile of strengths, weaknesses, and needs. These understandings, in
turn, affect the student's ability to access the curriculum and make effective
progress in school. The school psychologist’s effectiveness in completing both of these
objectives indirectly impacts the student’s learning, i.e., it strengthens the instruction,
environment, and conditions that support learning.
This DDM assesses the school psychologist’s ability to use a range of relevant
assessments to evaluate students’ current functioning and needs. In gathering data that
supports the use of varied assessment methods, we can measure how effectively the
school psychologist fulfills this key area of responsibility. It can also identify adjustments
that need to be made in the psychologist’s assessment practices.
If the communication tool were used, the psychologist would also survey participants of
Team meetings regarding their understanding of the student’s profile as a result of the
school psychologist’s communication of findings. This would measure the psychologist’s
ability to communicate the results of these assessments in written reports and during
Team meetings, such that participants indicate that they have gained important
understandings as a result.
This DDM aligns with several standards in the Massachusetts School Psychologists
Association (MSPA) Rubric for Evaluation of School Psychologists, which is patterned
after the MA Specialized Instructional Support Personnel Rubric. The MSPA rubric also
includes two additional elements: I.C.R (Intervention, Monitoring, and Evaluation) and
III.C.3 (Community Connections).
The MSPA Indicators below were selected as the focus for this DDM because they
indicate that the school psychologist's evaluation of a student should include both
formal testing instruments and informal methods. Together, these provide a fuller
understanding of the student's profile within the context of that student's developmental
stage, environment, culture, and particular circumstances. Additionally, the school
psychologist's student evaluation results must be communicated to all members of
the student's Team in a manner that informs everyone of that student's strengths and
needs to support his or her learning and development. These key assessment and
communication skills are, in large part, how the school psychologist creates an indirect
effect on the conditions available to support students' learning and success in school.
Content (Job Responsibility) and Weight
- The school psychologist uses a variety of informal and formal methods of
assessment to evaluate students' needs. (Measure #1 – Weight: 20%)
o Indicator I-B-1: Variety of Assessment Methods (MSPA Rubric for the
Evaluation of School Psychologists)
- The school psychologist communicates student evaluation data orally and in
writing at Team meetings in a way that leads participants – e.g., special
education teachers, regular education teachers, administrators/principals,
parents, other service providers, and the student when applicable – to indicate
that they have gained new or better understandings and supports within the
Team. (Measure #2 – Weight: 80%)
o Indicator I-C-2: Sharing Conclusions with Colleagues (MSPA Rubric for
the Evaluation of School Psychologists)
o Indicator I-C-3: Sharing Conclusions with Students and Families (MSPA
Rubric for the Evaluation of School Psychologists)
Total Weight: 100%
The above Indicators from the MSPA Rubric for Evaluation of School Psychologists are
further specified as follows:
Indicator I-B-1 Variety of Assessment Methods: "Uses a variety of informal and
formal methods of assessment to measure student learning, growth, and
understanding to develop differentiated and enhanced learning experiences and
improve future instruction" (NASP Domain 1: Data-based decision making and
accountability and NASP Domain 8: Diversity in Development and Learning):
"Strategically selects from a variety of assessment methods (i.e., review of
records, observation, interview/rating scales, and testing) to assess student
learning, behavior, and development to account for student differences in culture,
language, level of functioning, and referral concerns."
Indicator I-C-2 Sharing Conclusions with Colleagues (NASP Domain 3:
Consultation and Collaboration and NASP Domain 4: Interventions and mental
health services to develop social and life skills): "Presents key, relevant findings
to colleagues clearly, respectfully, and in sufficient detail to promote effective
collaboration that supports improved student learning and/or development."
Indicator I-C-3 Sharing Conclusions with Students and Families (NASP Domain
7: Family-School Collaboration Services): “Presents key, relevant assessment
findings to students and families in a clear, concise, non-technical, respectful
manner, and engages them in constructive conversation to promote student
learning and development."
Description of the Development Process
This DDM was developed during October 2014 – June 2015 under a DDM Leadership
Grant (FC-217) awarded to the SEEM Collaborative by the Massachusetts Department
of Elementary and Secondary Education (ESE). In partnership with the Learning
Innovations Program at WestEd (Woburn, MA), the Collaborative convened three
school psychologists and three guidance directors representing middle and high schools
from five participating districts. Participants worked in smaller teams of one to three
people to strengthen and apply their assessment literacy toward the development of
several direct and indirect measures of student growth.
Participants grew their expertise over six sessions by engaging in a guided DDM
development process framed by a series of questions, including:
(1) What is most important to measure?
(2) How shall we measure what’s most important?
(3) How can we strengthen and refine our measure?
(4) How can we prepare our measure for broader use?
(5) What do we want to gain from the pilot?
(6) What did we learn from the pilot?
Throughout, participants engaged in large group discussion and critique, as well as
team collaboration and problem solving. In addition to refinements made during these
sessions, each measure was also strengthened based on feedback from an ESE review
team. Measures were then piloted from March-June 2015. Finally, the group analyzed
data collected during the pilot phase, which informed final revisions, as described in the
closing pages of this document.
Next Steps
Districts in and beyond the Collaborative now have the opportunity to decide whether to
implement or modify the attached assessment for use as a DDM for school
psychologists. Because this is a newly developed measure, it is important that districts
engage building administrators in examining results from the first year of
implementation and in identifying, over time, any revisions or refinements that may
further strengthen the quality of the assessment, scoring tools, administration protocol,
and/or growth parameters to suit each district's local context.
Instrument
The purpose of the School Psychologist Assessment Checklist (Appendix A) is to
provide data to determine the extent to which the psychologist strategically selects from
a variety of assessment methods – i.e., review of records, observation, interview/rating
scales, and testing – to assess student learning, behavior, and development (MSPA
Rubric Indicator I-B-1).
The school psychologist uses the School Psychologist Assessment Checklist to document
the assessment tools used and the rationale for using these particular assessment tools
for every student evaluated. The school psychologist reviews the checklist results using
a rubric in November, February, and June. The psychologist also uses results to adjust
practices, if needed, to achieve the expected target parameters by June. Only the
summative evaluation at the end of the school year is scored to determine whether the
DDM’s target parameters were met.
The purpose of the School Psychologist Communication Survey (Appendix B) is to
provide data to determine the extent to which the school psychologist clearly presents
key, relevant findings in sufficient detail and with non-technical language to colleagues,
parents, and students (if attending the Team meeting). The intent is for the school
psychologist’s oral and written reports to engage Team participants in constructive
conversation and collaboration to promote student learning and development.
The school psychologist uses the School Psychologist Communication Survey to
gather information from Team members who participate in Team meetings regarding
their perceptions of the effectiveness of the psychologist’s oral and written
communication of his/her assessment results. Although the school psychologist would
gain additional information by including an item on the survey for respondents to
indicate whether they are classroom teachers, specialists, administrators, parents, or
students, this item was not included in order to protect respondents’ anonymity.
As with the School Psychologist Assessment Checklist, the school psychologist reviews
the survey results using a rubric in November, February, and June to support any
necessary adjustments to relevant practices. The psychologist also analyzes and
scores only the summative data at the end of the school year to determine whether the
DDM’s target parameters were met.
Administration Protocol
The Administration Protocol addresses how the measure is intended to be implemented
to best support a common conversation about student growth across classrooms.
When is the measure administered?
Data are collected throughout the school year, following each student evaluation and
the subsequent Team meeting at which evaluation results are presented and discussed.
The school psychologist completes the School Psychologist Assessment Checklist
for each student assessed. Thus, this checklist is maintained in an ongoing manner
throughout the school year and becomes part of the school psychologist's evaluation
routine.
The school psychologist asks Team members who attended the entire Team meeting –
specifically including the portion in which the school psychologist presented his/her
assessment results – to complete and return the School Psychologist
Communication Survey following every Team meeting. The survey will be sent as a
Google Form Survey, or provided as a hard copy with a self-addressed and stamped
envelope as necessary, within two school days of the Team meeting. Thus, these data
are also collected in an ongoing manner throughout the school year and the data
collection becomes part of the school psychologist’s routine.
Entries from the Checklist and from the Survey are analyzed at three points during the
year. Data from the start of school through November are analyzed at the end of
November. Data from the start of December through February are analyzed at the end
of February, and data from the start of March through June 1 (or a district-determined
date that aligns with the educator evaluation cycle) are analyzed in June. A
summative evaluation is completed by compiling these results and comparing them to
the Target Parameters.
How is the measure administered?
The school psychologist maintains an electronic version (or hard copy if necessary) of
the School Psychologist Report Checklist to complete as student evaluations occur.
As different methods or tools of evaluation are used, the school psychologist adds
these to the checklist, including the date, the assessment administered, and the
rationale or purpose for using that particular assessment.
A drop-down menu is included in the electronic version to streamline recording the
rationale or purpose. It includes the following options:
- Assessing area identified by parent
- Assessing area identified by referring teacher
- Assessing area identified during psychologist observation
- Assessing area identified from previous test results
- Assessing areas designated in full battery of tests
- Assessing additional areas of potential challenge suspected by psychologist
- Assessing area designated in available assessment (rather than preferred or
warranted assessment)
- Other
The school psychologist should also have the School Psychologist Communication
Survey prepared as a Google Form to send to Team members who attended the full
meeting within two school days following the meeting. If Google Forms is not available
for any reason, another online survey format, such as SurveyMonkey, may be used. A
hard copy may be provided to individuals who do not have Internet access, along with a
self-addressed and stamped envelope and instructions for how to return the completed
form to the school psychologist. When surveys are returned in hard copy, the school
psychologist or administrative assistant will need to enter the responses into a Google
Form to ensure that all results are compiled over time.
Although the survey provides basic information and instructions at the start, including a
request for its return within five days of the Team meeting, this is not a sufficient
introduction for most users. To ensure high response rates (at least 60-80% of those
attending Team meetings return the survey) and thoughtful responses, it is essential
that the purpose, intended use, time needed, and directions for the survey are explained
verbally to Team meeting participants, preferably in advance of the meeting.
Participants should also have an opportunity to ask clarifying questions before
responding.
Although these directions could also be provided in writing, via note or email, the
explanation must be simple and clear, and invite the participants’ questions in case
clarification is needed. Explanation could also be provided at the close of a Team
meeting. It should be expected, however, that there may not be sufficient time to explain
the process or respond to questions. Close-of-meeting reminders work best for
participants who have already been introduced to the survey or have used it previously.
If the school psychologist is unable to gain at least a 60%-80% return rate on the
Communication Surveys distributed, despite multiple and varied attempts to increase
the return, the results must be interpreted with caution. The district should also consider
modifying the DDM or using a different DDM in future years.
Suggested Explanation from Psychologist to Participants (translated as necessary
for different audiences):
You will be asked to complete a brief, anonymous survey following [student’s] Team
Meeting on [scheduled date]. The survey will only take 3-5 minutes to complete. The
survey asks for your feedback on the clarity and usefulness of my communications in
(1) the written report that I prepare for you describing the student’s evaluation results
and recommendations, and (2) verbally, during my explanations in the Team meeting
itself.
I am asking all participants at Team meetings to complete this survey because I am
always working to ensure that my communications are clear and useful; if not, your
responses will help me to identify the ways in which I can improve. [For school
educators: This survey is part of my DDM that provides information about my impact on
the conditions supporting students’ learning.]
I will send you a link to the survey via email within two days of our meeting. If you’d
prefer, I can provide a hard copy of the survey for you, along with an envelope for its
return. I just ask that you please respond to the survey during the week following our
Team meeting. If for any reason you need to complete it later, please send it in as I
value your feedback. It would be best, however, to complete it as close to our Team
meeting time as possible.
Do you have any questions about this survey or how I intend to use the responses?
Special Accommodations
It is important to consider any language or cultural issues that may affect parents'
ability to complete the School Psychologist Communication Survey. For those
whose first language is other than English, explanations of the survey's purpose,
intended use, time needed, and directions for completion should be provided in the
parent's or student's primary language; the survey should also be
provided in that language. In anticipation of this, it is recommended that the school
psychologist work closely with an ELL specialist to identify the most commonly spoken
languages in the school community and to develop survey explanations/directions and
accurate survey translations in those languages in advance so they are ready to
distribute.
If any Team participant is unable to complete the survey due to a physical disability, he
or she may request that an administrative assistant record his or her responses into the
Google Form in person at the school or over the phone. The school psychologist may
not collect these responses personally, as doing so may bias the results.
Parents who attend the meeting but live at separate addresses are provided separate
surveys. Parents who attend but live at the same address may respond to a single
survey, with the option of receiving additional surveys for other family members who
attended the meeting.
How are deviations to protocols addressed?
The survey is anonymous; no specific follow-up will occur if Team meeting participants
do not complete it. As a result, it is essential that the school psychologist continuously
monitor the survey response rate to ensure that sufficient responses are returned. If not,
the psychologist must introduce strategies to increase the response rate. For example,
these might include: clarifying the purpose of the survey; providing hard copies of the
survey to staff and parents to illustrate the form’s brevity; encouraging return of the
survey through individual conversations prior to the Team meeting; or sharing results of
the survey back to Team members with reflections for improvement to illustrate how
survey responses are used.
Scoring Guide
Evaluation and scoring are accomplished through the following four steps:
Measure 1, Step 1: Assessment Checklist Rubric (Appendix C)
The school psychologist uses the Assessment Checklist Rubric to analyze and
monitor the extent to which checklist entries from September to November, December
to February, and March to June reflect a variety of student assessments and
appropriate use of this variety. Where results fall short, the psychologist adjusts his or
her practice accordingly to achieve or surpass the expected Target Parameters.
In June, the school psychologist conducts a summative assessment of all three periods
(November, February, and June results), using the final rows of the Assessment
Checklist Rubric. The summative score for this first portion of the assessment is worth
20% of the final DDM score.
Scoring directions are provided on the Assessment Checklist Rubric; they state:
Directions:
Read the descriptors for levels 3, 4, and 5 under “Variety of Assessment Methods.”
Review the School Psychologist Assessment Checklist records for the designated time
period (Sept-Nov, Dec-Feb, or Mar-June). Circle the point value – 3, 4, or 5 –
associated with the level of performance demonstrated for this time period. Repeat
these steps for the “Strategic Selection” portion of the rubric.
At the end of the school year – i.e., at the designated time when DDM results must be
calculated – record the average score for each of the two sections of the rubric in the
"Overall Average" row. For example, if the psychologist earned a 4 (Sept-Nov), a 4
(Dec-Feb), and a 5 (Mar-June) for "Variety of Assessment Methods," the overall average
for this section would be the sum of 4+4+5 divided by 3, or 4.33. Next, in the final
"Combined Overall Average" row, add the two Overall Average scores. For example, if
the psychologist earned a 4.33 for the Overall Average for Variety of Assessment
Methods and a 4.66 for the Overall Average for Strategic Selection, the Combined
Overall Average would be 4.33 + 4.66, or 8.99 out of a possible 10 points.
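The averaging above is simple to automate. Below is a minimal Python sketch of the worked example, assuming the document's convention of truncating each section average to two decimals before adding (which is how 4.33 + 4.66 yields 8.99 rather than the exact 9.00); the Strategic Selection period scores shown are hypothetical values chosen only to reproduce the 4.66 average.

```python
import math

def overall_average(period_scores):
    """Average one rubric section's 3/4/5-point scores across the three periods."""
    return sum(period_scores) / len(period_scores)

def truncate2(x):
    """Truncate to two decimals, matching the worked example (4.333... -> 4.33)."""
    return math.floor(x * 100) / 100

variety = [4, 4, 5]     # Sept-Nov, Dec-Feb, Mar-June -> averages to 4.33
selection = [5, 4, 5]   # hypothetical scores averaging to 4.66

combined = truncate2(overall_average(variety)) + truncate2(overall_average(selection))
print(round(combined, 2))   # 8.99 out of a possible 10
```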
Measure 2, Step 2: Communication Survey Score Sheet (Appendix D)
The school psychologist uses the Communication Survey Score Sheet to analyze and
monitor the extent to which Team meeting participants indicated that the psychologist’s
oral and written communications – from September to November, from December to
February, and from March to June – were clear and that they supported the
development of useful understandings and discussion.
The chart in Appendix D represents one way to track Communication Survey results
during each assessment period, i.e., Sept-Nov, Dec-Feb, and Mar-June. If using an
online survey tool such as Google Forms or SurveyMonkey, the school psychologist
may opt to use the data analytics provided with that tool, as long as the critical data are
calculated:
(1) Average score per item for the assessment period
(2) Average score per survey section for the assessment period
(3) Overall average score for the assessment period
These key data points are shown in the bottom rows of the example score sheet in
Appendix D; formulas can be entered in the spreadsheet to calculate them
automatically.
It is recommended that a separate score sheet be added to the Excel workbook for
each of the three assessment periods during the year.
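For districts that prefer scripting over spreadsheet formulas, the three calculations can be expressed directly. This is a minimal sketch, assuming each returned survey is recorded as ten item scores (8/6/4/2/0) in question order, with items 1-4, 5-8, and 9-10 forming the survey's three sections, and assuming the period's overall average is the mean total score per returned survey (maximum 80 points, consistent with the example values of 60.6 and 72.0 used later in this document); all function names are illustrative.

```python
# Sections follow the survey in Appendix B: items 1-4 written report,
# items 5-8 oral communications, items 9-10 overall communication.
SECTIONS = {"written": range(0, 4), "oral": range(4, 8), "overall": range(8, 10)}

def item_averages(surveys):
    """(1) Average score per item for the assessment period."""
    n = len(surveys)
    return [sum(s[i] for s in surveys) / n for i in range(10)]

def section_averages(surveys):
    """(2) Average score per survey section for the assessment period."""
    per_item = item_averages(surveys)
    return {name: sum(per_item[i] for i in idx) / len(idx)
            for name, idx in SECTIONS.items()}

def period_overall_average(surveys):
    """(3) Overall average: mean total score per returned survey (max 80)."""
    return sum(sum(s) for s in surveys) / len(surveys)

# Example: two returned surveys for one assessment period
surveys = [
    [8, 6, 8, 8, 6, 6, 8, 8, 8, 6],   # totals 72
    [6, 6, 8, 6, 4, 6, 6, 8, 8, 8],   # totals 66
]
print(period_overall_average(surveys))    # 69.0
print(section_averages(surveys)["oral"])  # 6.5
```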
Measure 2, Step 3: Communication Survey Response Rate Tracker (Appendix E)
For each assessment period, record the Total Number of Team Meeting Participants in
the second column – this equals the number of Communication Surveys distributed
during the period – and the Total Number of Surveys Returned in the third column.
Next, calculate the Response Rate for each period by dividing the Total Number of
Surveys Returned (third column) by the Total Number of Team Meeting Participants, or
surveys distributed (second column).
Then add the three entries for the Total Number of Team Meeting Participants and
record in the Overall cell at the bottom of the column. Do the same to calculate the
Overall surveys returned.
When calculating the Overall Average Return Rate (lowest, far right cell in the table),
divide the Overall Number of Surveys Returned (bottom row, third column) by the
Overall Number of Team Meeting Participants (bottom row, second column). This
avoids rounding errors that will occur if you use each period’s response rate to calculate
the Overall Average Response Rate.
SEEM Collaborative – DDM – School Psychologists K-12
10
Example Response Rate Tracker
(Target Return Rate: 60-80%)

Period      Surveys Distributed (Team Mtg Participants)   Surveys Returned   Response Rate
Sept-Nov    40                                            28                 70%
Dec-Feb     22                                            16                 73%
Mar-June    45                                            23                 51%
OVERALL     107                                           67                 63% (Avg Return Rate)
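The reason the directions compute the overall rate from pooled totals can be seen with the example tracker's numbers: averaging the three period rates would give the small December-February period the same weight as the larger ones. A minimal Python sketch, using only the figures from the table above:

```python
distributed = [40, 22, 45]   # surveys distributed per period
returned = [28, 16, 23]      # surveys returned per period

# Per-period response rates: 70%, 73%, 51%, as in the table
print([f"{r / d:.0%}" for r, d in zip(returned, distributed)])

# Overall rate from pooled totals: 67 / 107 = 63% (the method the directions require)
print(f"{sum(returned) / sum(distributed):.0%}")

# Averaging the period rates instead ignores how many surveys each period
# contributed and drifts to about 65%
rates = [r / d for r, d in zip(returned, distributed)]
print(f"{sum(rates) / len(rates):.0%}")
```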
Step 4: Calculating the Final DDM Score (Appendix F)
At the end of the school year (or at the designated time when DDM results must be
calculated), the school psychologist calculates the Final DDM score, where the
Assessment Checklist results account for 20% of the overall score and the
Communication Survey results account for 80% of the overall score.
First, the school psychologist enters the Combined Overall Average from the bottom
row of the Assessment Checklist Rubric in the area marked X, below.
Second, the school psychologist calculates an average of the three “Overall Average”
scores for the three assessment periods shown in the Communication Survey Score
Sheet and records this in the area marked Y, below. For example, if the school
psychologist earned a 60.6 for Sept-Nov, 72.0 for Dec-Feb, and 55.5 for Mar-June, the
total score for the year for the Communication Survey portion of the DDM would be
(60.6 + 72.0 + 55.5) divided by 3, or 62.7.
Finally, the school psychologist adds these two scores together and records the sum as
the FINAL SCORE out of a possible 100 points.
This FINAL SCORE performance is then compared against the Target Parameters, as
described below.
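Expressed as arithmetic, the combination looks like the sketch below. One caveat: the text enters the Combined Overall Average (maximum 10 points) directly as X, yet describes the final score as out of 100 with the checklist worth 20 points; the sketch therefore assumes X is doubled onto a 20-point scale, which is an inference, not something the source states.

```python
def final_ddm_score(combined_overall_average, survey_overall_averages):
    """Combine the two DDM components into a 100-point final score."""
    # ASSUMPTION: scale the 10-point rubric average up to the 20-point
    # checklist portion; the source leaves this scaling unstated.
    x = combined_overall_average * 2.0
    # Y: mean of the three periods' overall averages (max 80 points)
    y = sum(survey_overall_averages) / len(survey_overall_averages)
    return x + y

# Worked example from the text: survey averages 60.6, 72.0, 55.5 -> Y = 62.7
print(round(final_ddm_score(8.99, [60.6, 72.0, 55.5]), 2))   # 80.68
```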
Measuring Growth and Setting Target Parameters
The development team discussed the number of easy, moderate, and hard items
included in the measurement tool (see the Assessment Blueprint). The team also
discussed the potential scores that would need to be achieved on the various items to
indicate that the school psychologist is meeting the expected target of using a variety of
assessments and conveying findings effectively to the Team. The team then estimated
the range of performance that would be considered meeting the MSPA standards and
noted this in terms of the "Moderate" Performance parameters.
The team will use the pilot period to examine the average scores received by a group of
school psychologists and may modify these estimated target parameters accordingly.
SEEM Collaborative – DDM – School Psychologists K-12
11
Estimated Target Parameters

Target        Low Performance   Moderate Performance   High Performance
Final Score   < 65 pts          65-85 pts              > 85 pts
Final Score   < 3 pts           3-5 pts                > 5 pts
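Comparing a FINAL SCORE against the 100-point row of the table is a one-line check. A minimal Python sketch, using the band boundaries exactly as given:

```python
def performance_band(final_score):
    """Classify a 100-point FINAL SCORE against the estimated target parameters."""
    if final_score < 65:
        return "Low Performance"
    if final_score <= 85:
        return "Moderate Performance"   # 65-85 pts
    return "High Performance"           # > 85 pts

print(performance_band(80.68))   # Moderate Performance
```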
Piloting
Our pilot plan is for the following educators, representing educators in and beyond the
DDM team at a range of schools, districts, and grade levels if relevant, to pilot this DDM
between the end of January 2015 and the end of May 2015:

Name: Lydia Rundell-Gjerde
Role: School Psychologist
School: Lynnfield High School
District: Lynnfield Public Schools
Grade Level: 9-12
DDM Team Primary Contact: Lydia Rundell-Gjerde
6
In particular, we aim to learn the following from our pilot:
1. To what extent does the school psychologist use a variety of student-specific
informal and formal assessment methods to evaluate referred students?
2. To what extent at subsequent Team meetings does the school psychologist
communicate student evaluation results orally and in written reports in a way that
participants indicate supports their understanding of the student’s needs?
Development Process
This measure for School Psychologists was refined through both small trials and a
longer-term pilot from January – June 2015. Additionally, the assessment received
multiple peer and expert reviews during initial development. For example, staff at
WestEd (Woburn, MA) critiqued the assessment task and associated scoring materials.
An ESE review team also provided critical feedback, which led to further revisions.
Finally, a longer term pilot of the assessment in the developers’ own schools and
districts contributed to the development of increasingly specific, feasible, and
informative tools for potential users. Lydia presented the measure to other school
psychologists within the Lynnfield Public School District to review and consider for
piloting. Feedback noted that the School Psychologist Communication Survey would be
problematic to implement for a variety of reasons – e.g., concerns about the evaluative
nature of the survey for union purposes, and the likelihood that many of those who
would be surveyed could not adequately provide such feedback. One school
psychologist was able to provide data for completing the School Psychologist
Assessment Checklist, evaluating the variety and strategic selection of
assessments used in the psychological evaluation of students.
This challenge may be unique to this district at this time; the survey could be piloted, as
included, as a measure in another district. Continued refinement of this measure is
recommended for any district.
Pilot Results
Feedback from other school psychologists who reviewed this measure focused on the
complexity of the data to be gathered and scored, as well as concerns regarding the
use of a survey method, as stated above. The School Psychologist Assessment
Checklist was thought to be beneficial in documenting the variety of assessments,
both informal and formal, used in evaluating our students. The pilot for that part of this
measure was completed over the course of about five weeks, with information from the
assessment of students at Lynnfield's Middle and High Schools.
DDM Component: School Psychologist Assessment Checklist
Distribution and Central Tendency of Scores

Lydia 1 (point value scores): # of Assessment Checklists Completed = 8;
Combined Overall Average – Low Score = 7; Combined Overall Average – High Score = 10
Lydia 2 (actual # of assessments): Total assessments = 61; Low Score = 3; High Score = 14
The data gathered regarding the variety of assessments used in our evaluations
indicate that we consistently include at least three to four, and as many as 14,
individual assessments as part of an overall student evaluation. Situations in which
fewer assessments were used were reevaluations of students whose needs had
previously been well documented. More comprehensive assessments were completed
for students whose needs or profiles are more complex. Regardless, all student
evaluations included a variety of informal and formal assessments in order to document
different aspects of the student's profile, strengths, and presentation, as well as current
academic, social-emotional, executive functioning, social, and behavioral needs.
Additionally, every assessment was recorded with a strategic rationale explaining why it
was included as part of the evaluation.
The scoring of this measure may require some refinement, particularly in light of
excluding the survey as part of the larger DDM.
Assessment Blueprint
Appendix A
School Psychologist Assessment Checklist (Excel sheet)
Appendix B
School Psychologist Communication Survey
The purpose of this survey is to solicit anonymous feedback from members of a student’s
educational Team regarding the effectiveness of the school psychologist’s oral and written
communication of evaluation results to the Team. The school psychologist will compile and
analyze survey responses over time to monitor and improve his/her communication skills.
Directions: Please place one checkmark in each row to show the extent to which you agree with
each of the following statements regarding the school psychologist’s written and oral
communication at the Team meeting you recently attended. Select just one response in each
row. This survey will take approximately five minutes to complete. Please aim to return this form
within five days of the meeting, if possible.
[For hard copy version: Please return this survey to: _______________or mail to: _________]
Response options (point values): Strongly Agree (8) | Agree (6) | Neither Agree nor
Disagree (4) | Disagree (2) | Strongly Disagree (0)

Response to Written Report
1. The School Psychologist presented assessment findings in the written report clearly
and respectfully.
2. The School Psychologist presented assessment findings in the written report in
sufficient detail.
3. The School Psychologist presented assessment findings in the written report using
non-technical, jargon-free language.
4. The School Psychologist's written report interpreted and connected assessment
results to the student's performance, needs, and subsequent recommendations.

Response to Oral Meeting Communications
5. The School Psychologist presented assessment findings orally in the meeting clearly
and respectfully.
6. The School Psychologist presented assessment findings orally in the meeting in
sufficient detail.
7. The School Psychologist presented assessment findings orally in the meeting using
non-technical, jargon-free language.
8. The School Psychologist orally interpreted and connected assessment results to the
student's performance, needs, and subsequent recommendations.

Overall Communication
9. The School Psychologist's written and oral communication of assessment findings
supported the Team's ability to engage in a constructive conversation about the
student's learning needs.
10. As a result of the School Psychologist's written and oral communication of
assessment findings, I have a new or better understanding about the student's overall
learning profile, including strengths and needs.
Appendix C
Step 1: Assessment Checklist Rubric
Directions: Read the descriptors for levels 3, 4, and 5 under "Variety of Assessment Methods." Review the School Psychologist Assessment
Checklist records for the designated time period (Sept-Nov, Dec-Feb, or Mar-June). Circle the point value (3, 4, or 5) associated with the level
of performance demonstrated for this time period. Repeat these steps for the "Strategic Selection" portion of the rubric.
At the end of the school year – at the designated time when DDM results must be calculated – record the average score for each of the two
sections of the rubric in the "Overall Average" row. For example, if the psychologist earned a 4 (Sept-Nov), a 4 (Dec-Feb), and a 5 (Mar-June)
for "Variety of Assessment Methods," the overall average for this section would be the sum of 4+4+5 divided by 3, or 4.33. Next, in the final
"Combined Overall Average" row, add the two Overall Average scores. For example, if the psychologist earned a 4.33 for the Overall Average
for Variety of Assessment Methods and a 4.66 for the Overall Average for Strategic Selection, the Combined Overall Average would be 4.33 +
4.66, or 8.99 out of a possible 10 points.
Variety of Assessment Methods
o Uses a variety of informal and formal methods of assessment to measure student learning, growth, and
understanding to develop differentiated and enhanced learning experiences and improve future instruction

Level 3 (3 pts): Used some varied assessment types across students; rarely used more than one
assessment type per student
Level 4 (4 pts): Used varied assessment types across students; sometimes used more than one
assessment type per student
Level 5 (5 pts): Used full range of assessment types across students; commonly used more than one
assessment type per student

Circle one score (3, 4, or 5 pts) for each period:
Sept-Nov:   3 pts   4 pts   5 pts
Dec-Feb:    3 pts   4 pts   5 pts
Mar-June:   3 pts   4 pts   5 pts
Overall Average: ______

Strategic Selection
o Strategically selects from a variety of assessment methods to assess student learning, behavior, and
development to account for student differences in culture, language, level of functioning, and referral concerns

Level 3 (3 pts): Rarely provides logical rationale for selection of assessments
Level 4 (4 pts): Sometimes provides logical rationale for selection of assessments
Level 5 (5 pts): Consistently provides logical rationale for selection of assessments

Circle one score (3, 4, or 5 pts) for each period:
Sept-Nov:   3 pts   4 pts   5 pts
Dec-Feb:    3 pts   4 pts   5 pts
Mar-June:   3 pts   4 pts   5 pts
Overall Average: ______

Combined Overall Average (sum of the two Overall Averages, out of 10): ______
Appendix D
Step 2: Communication Survey Score Sheet
Example Communication Survey Score Chart for the First Assessment Period of the School Year
The school psychologist can monitor these survey data and, in response, make adjustments to communication practices. For example, in the
example score chart, the school psychologist might see that Question 4 ("The School Psychologist's written report interpreted and connected
assessment results to the student's performance, needs, and subsequent recommendations") was a relative strength during September to
November, and that Question 5 ("The School Psychologist presented assessment findings orally in the meeting clearly and respectfully") was a
relative weakness. Assuming that the survey response rate reaches the targeted level of 60-80% of those attending the Team meetings,
these observations can begin to inform changes to the school psychologist's practice.
Appendix E
Step 3: Communication Survey Response Rate Tracker
Directions: For each assessment period, record the Total Number of Team Meeting Participants in the second column – this equals the
number of Communication Surveys distributed during the period – and the Total Number of Surveys Returned in the third column.
Next, calculate the Response Rate for each period by dividing the Total Number of Surveys Returned (third column) by the Total Number of
Team Meeting Participants, or surveys distributed (second column).
Next, add the three entries for the Total Number of Team Meeting Participants and record in the Overall cell at the bottom of the column. Do the
same to calculate the Overall surveys returned.
When calculating the Overall Average Return Rate (lowest, far right cell in the table), divide the Overall Number of Surveys Returned (bottom
row, third column) by the Overall Number of Team Meeting Participants (bottom row, second column). This avoids rounding errors that will
occur if you use each period’s response rate to calculate the Overall Average Response Rate.
Period      Total # of Team Mtg Participants /   Total # of          Response Rate
            Surveys Distributed                  Surveys Returned    (Target Return Rate: 60-80%)
Sept-Nov    ______                               ______              ______
Dec-Feb     ______                               ______              ______
Mar-June    ______                               ______              ______
OVERALL     ______                               ______              Avg Return Rate: ______
Appendix F
Step 4: Calculating the Final DDM Score
Directions: At the end of the school year (or at the designated time when DDM results must be calculated), the school psychologist calculates
the Final DDM score. The Assessment Checklist results account for 20% of the overall score and the Communication Survey results account
for 80% of the overall score.
First, the school psychologist enters the Combined Overall Average from the bottom row of the Assessment Checklist Rubric in the area
marked X, below.
Second, the school psychologist calculates an average of the three “Overall Average” scores for the three assessment periods shown in the
Communication Survey Score Sheet and records this in the area marked Y, below. For example, if the school psychologist earned a 60.6 for
Sept-Nov, 72.0 for Dec-Feb, and 55.5 for Mar-June, the total score for the Communication Survey portion of the DDM would be
(60.6 + 72.0 + 55.5) divided by 3, or 62.7 for the year.
Finally, the school psychologist adds these two scores together and records the sum as the FINAL SCORE out of a possible 100 points.
DDM Score                                        Partial Scores
Assessment Checklist Rubric Score (20%) –
Combined Overall Average                         X: ______
Communication Score Sheet (80%) –
Overall Average                                  Y: ______
FINAL SCORE (X + Y, out of 100 points)           ______