
District-Determined Measure Example
Effective Use and Communication of Social/Emotional Assessment
Data
Content Area and Grade Range: Social/Emotional Data, grades K-12
DDM Summary: This DDM assesses the ability of school psychologists, or other evaluators, to
clearly explain and connect social/emotional assessment results and subsequent classroom
recommendations through written reports and team meeting presentations.
Developed by: Colleen McDonald, School Psychologist (Whitman-Hanson Regional School
District), Sarah Hargrove, School Psychologist (Whitman-Hanson Regional School District), and
Tara-Jean Grabert, School Psychologist (Whitman-Hanson Regional School District)
Reviewed by: Sonya Meiran (ESE); Matt Hollaway (ESE); Craig Waterman (ESE)
Pilot Districts: Whitman-Hanson Regional School District, Wellesley Public Schools, Scituate
Public Schools
Date updated: June 2015
Table of Contents
Introduction ...................................................................................................................................... 2
Instrument ........................................................................................................................................ 3
Administration Protocol .................................................................................................................. 3
Scoring Guide .................................................................................................................................. 5
Measuring Growth and Setting Parameters ................................................................................... 9
Piloting ............................................................................................................................................ 10
Assessment Blueprint ................................................................................................................... 13
North River Collaborative – DDM – Social/Emotional Assessment Data 1
Introduction
This DDM is a target measure, rather than a growth measure, of the school psychologist’s indirect
impact on student learning. Specifically, it consists of a survey designed to solicit teachers’
perceptions of the extent to which the school psychologist’s verbal and written communication of a
student’s social/emotional assessment data has informed their understanding of the student’s
difficulties and of how those difficulties affect the student’s learning. Additionally, it asks about
teachers’ understanding of the psychologist’s recommendations, based on the reported
social/emotional assessment data, and their perceptions of whether these recommendations are
feasible to implement in the school setting as a means of addressing the student’s social/emotional challenges.
This brief survey consists of eight items that ask the respondents to indicate “agree” or “disagree” if
the listed expectation was met either during the team meeting or through the school psychologist’s
written report.
This assessment for evaluators is newly developed and has been refined through a series of small
pilots with teachers, but has not been officially administered as a DDM. Districts are encouraged to
adopt this DDM as designed, or refine and/or modify the survey, scoring template, and
administration protocol to suit their unique needs and local circumstances. For example, although
this tool was designed as a target measure, it can also be modified to serve as a measure of an
evaluator’s growth over time. Additionally, while this DDM focuses on the assessment of students’
social/emotional skills, districts can modify it to apply to a variety of assessment results, including
cognitive, academic achievement, speech/language, etc.
This measure is aligned to the following Core Course Objective (CCO): Evaluators will
communicate, orally and in writing, students’ social/emotional testing data and recommendations in
a way that teachers perceive as clear, relevant, and practical.
A CCO is a statement that describes core, essential, or high priority content—i.e., knowledge, skills,
or abilities—identified by those who designed the assessment, which is drawn, synthesized, or
composed from a larger set of curriculum or professional standards.
This CCO was identified as the basis for the DDM due to the increasing number of social/emotional
evaluations that school psychologists must complete. Additionally, the school psychologists’
communication of social/emotional assessment data plays a crucial role in determining services and
supports for students. To ensure that appropriate classroom services are delivered, teachers
working directly with students with social/emotional difficulties must understand the evaluation data
reported by the school psychologist. Teachers must also understand the recommendations that are
described in the reports and team meetings. This DDM serves the purpose of assessing the school
psychologist’s oral and written communication abilities when evaluating students with
social/emotional difficulties. It allows teachers to indicate if the school psychologist’s written and oral
reports are clear and support the teacher’s work with the student.
Content (Job Responsibility) and Weight

1. Evaluators communicate students’ social/emotional assessment data in ways that teachers
perceive as clear, both orally and in writing. (Weight: 50% of the measure)
MSPA Standard I-C-2: School psychologists present key, relevant findings to colleagues clearly,
respectfully, and in sufficient detail to promote effective collaboration that supports improved
student learning and/or development.

2. Evaluators will communicate social/emotional recommendations in written reports in ways that
teachers perceive as clear. (Weight: 12.5% of the measure)
MSPA Standard I-C-1: School psychologists skillfully interpret assessment findings and relate
them to educational performance, needs, and recommendations.

3. Evaluators will communicate social/emotional recommendations in written reports in ways that
teachers perceive as relevant. (Weight: 25% of the measure)
MSPA Standard I-C-1: School psychologists skillfully interpret assessment findings and relate
them to educational performance, needs, and recommendations.

4. Evaluators will communicate social/emotional recommendations in written reports in ways that
teachers perceive as practical. (Weight: 12.5% of the measure)
MSPA Standard I-C-1: School psychologists skillfully interpret assessment findings and relate
them to educational performance, needs, and recommendations.

Total weight: 100%
Instrument
This assessment is a brief eight-item survey that asks the respondents to indicate “agree” or
“disagree” if the listed expectation was met either during the team meeting or through the school
psychologist’s written report. For example: “[The school psychologist] explained the student’s
social/emotional assessment results clearly in the written report.” Four of the items measure the
educator’s perception of the clarity of the school psychologist’s presentation of the student’s
assessment data and four of the items measure perception of the clarity, feasibility, and relevance
of the school psychologist’s recommendations. There is also an open-ended comments section
should the rater wish to provide more specific feedback about his/her experience with the school
psychologist’s report or presentation in the meeting.
Administration Protocol
This administration protocol is provided to increase the likelihood that the assessment is
administered in a fair and consistent manner across all students, classrooms, and/or schools that
use the same DDM and over time. Adherence to this protocol will reduce the variation in local
decisions when administering the assessment; it will also increase the comparability of the data
collected.
When is the measure administered?
The survey tool was developed to elicit feedback from teachers following a student evaluation that
includes a social/emotional assessment. Throughout the school year, surveys are distributed to
teachers’ mailboxes following all TEAM meetings that involve the discussion of social/emotional
assessment results.
The school psychologist is expected to place a survey in a participating educator’s mailbox within 24
hours of the TEAM meeting. To preserve respondent anonymity, an automatic reminder email is sent
one week after distribution to all teachers who received the survey, including those who have already
returned the form. Those surveyed should be the individual(s) who are responsible for the
implementation of the recommendations made by the school psychologist. In some instances, those
surveyed will be the regular education classroom teacher; it is also appropriate for special education
teachers to provide feedback if they are the staff primarily responsible for meeting the student’s
needs. Those responsible for providing feedback must have received a copy of the evaluator’s
written report and must have attended the TEAM meeting.
How is the measure administered?
The school psychologist should prepare for each assessment by printing a copy of the survey and
cover letter prior to the TEAM meeting. In addition, the survey’s return rate benefits if the
psychologist touches base informally with the participating educators beforehand. The purpose of
that communication is to remind teachers that they will be receiving the DDM survey, what it is
about (particularly the psychologist’s report and recommendations), and how the results
will be used.
The survey includes instructions for the educator to follow. The instructions read: Please take 5-10
minutes to complete the following survey regarding your student’s recent social/emotional
assessment and the recommendations offered in the report and presented in the meeting. Please
read each question carefully and put an “X” in the box under your corresponding answer. Please
complete all items. If any of the recommendations or results were unclear or confusing then your
response should be “disagree.” There is space at the end of this survey should you want to clarify
your responses or offer any additional comments.
Additionally, the school psychologist provides a cover letter that: (1) explains the purpose of the
survey, (2) outlines the process for survey return for the purposes of preserving anonymity and
increasing respondent willingness to participate, and (3) encourages the respondent to give honest
feedback for the purposes of guiding the school psychologist’s practice.
All teachers who attend Team meetings that fit the criteria for this DDM, which includes a
social/emotional assessment, will receive the survey. It is suggested that the school psychologist
have a minimum sample size of 10 surveys or 20% of total evaluation caseload involving
social/emotional components, whichever is greater. The school psychologist notes a specific return
date in a clear and visible place on the cover letter and places it in the teacher’s mailbox within 24
hours of the TEAM meeting. Teachers are asked to complete and return the survey to a designated
third party within one week; this person will vary across schools/districts. The school psychologist
will keep track of the number of surveys he/she has distributed and the number of surveys returned;
it is not necessary, however, for teachers to identify themselves on the form. After one week, the
school psychologist will send an automatic email reminder to all educators who received the survey
encouraging participation. These steps are necessary to ensure a high return rate for these surveys.
For the administration of the survey, no modifications or accommodations are needed. Teachers
should be encouraged, however, to consider the student’s language proficiency and other cultural
factors when determining the appropriateness of evaluator’s communication and recommendations.
How are deviations from the protocol addressed? Designating a neutral third party (e.g., the school
secretary) as the individual who will receive the surveys should ensure a high rate of return. To
better protect confidentiality, SISPs may also wish to collect the surveys only three times per year,
so that individual surveys are not seen immediately after the team meeting in which they were
distributed, but later on. Finally, a district using this as a growth measure may wish to tally results
three times per year to monitor progress, rather than only at the conclusion of the year.
Scoring Guide
This assessment packet includes a Summary Scoring Template (below) to be used after the
surveys have been collected. Two scores can be derived from this instrument. The first score (to the
far right on the template) is a percentage of those respondents who responded “agree” on each
item. The percentages calculated for the individual items can be used as feedback to guide the
school psychologist’s future practice. The second score (bottom right corner) is an overall
percentage of “agree” responses, which indicates the level of the school psychologist’s impact in the
areas assessed in the survey tool. These percentages are later interpreted in relation to the Target
Parameters to determine whether the school psychologist demonstrated low, moderate, or high
impact on students using this DDM in this given year.
Scoring Process
Item Analysis
1. Tally all categories after all the surveys have been returned.
2. Record the Total Disagree and Total Agree responses for each row.
3. Record the Total Number of Responses for each row. Remember, educators may skip
some items, so these total numbers may vary across items.
4. Going across each row, divide the Total Agree number by the Total Number of
Responses to determine the percentage of agreement. Record this percentage in the
final column. These scores can be used as specific feedback to guide future practice.
Overall Impact
1. Add the values in the Total Disagree column and record at the bottom of the column.
2. Add the values in the Total Agree column and record at the bottom of the column.
3. Add the values in the Total Number of Responses column and record at the bottom of the
column.
4. Divide the Total Agree by the Total Number of Responses to determine an overall
percentage of responses that indicate favorable perceptions. Record the percentage in
the final cell at the far, bottom right of the table. This score can be used to determine the
school psychologist’s level of impact. (See Target Parameters section.)
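The Item Analysis and Overall Impact arithmetic above can be sketched in a few lines. This is an illustrative sketch only, not part of the DDM: it assumes each returned survey is represented as a dict mapping item IDs to "agree", "disagree", or None for a skipped item, and the function and field names are hypothetical.

```python
# Sketch of the Scoring Process: per-item percentages plus overall impact.
# Skipped items (None) are excluded from an item's Total Number of Responses.

def score_surveys(surveys, item_ids):
    per_item = {}
    total_agree = total_responses = 0
    for item in item_ids:
        answers = [s.get(item) for s in surveys]
        agree = sum(1 for a in answers if a == "agree")
        disagree = sum(1 for a in answers if a == "disagree")
        responses = agree + disagree  # total responses for this row
        per_item[item] = {
            "agree": agree,
            "disagree": disagree,
            "responses": responses,
            # Percentage of Agree = Total Agree / Total Responses
            "pct_agree": round(100 * agree / responses, 1) if responses else None,
        }
        total_agree += agree
        total_responses += responses
    # Overall Impact = column totals of Agree divided by column totals of Responses
    overall = round(100 * total_agree / total_responses, 1) if total_responses else None
    return per_item, overall
```

For example, two returned surveys in which both respondents agreed on item AD1, one disagreed on item R1, and one skipped R1 would yield 100% agreement on AD1, 0% on R1 (from a single response), and an overall score of 2 agreements out of 3 responses.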
Who should score the assessment? The school psychologist should administer the survey,
collect the completed forms from the designated third party, and tabulate the scores.
How should scorers prepare for scoring? The school psychologist should have all survey
responses as well as a blank Summary Scoring Template in preparation for scoring.
DDM Summary Scoring Template
Item Analysis
1. After all the surveys have been returned, tally all categories.
2. Record the Total Disagree and Total Agree responses for each row.
3. Record the Total Number of Responses for each row. (Remember, educators may skip some items, so
these total numbers may vary across items.)
4. Going across each row, divide the Total Agree number by the Total Number of Responses to
determine the percentage of agreement. Record this percentage in the final column.
Overall Impact
1. Add the values in the Total Disagree column and record at the bottom of the column.
2. Add the values in the Total Agree column and record at the bottom of the column.
3. Add the values in the Total Number of Responses column and record at the bottom of the column.
4. Divide the Total Agree by the Total Number of Responses to determine an overall percentage of
responses that indicate favorable perceptions. Record in the final cell at the far, bottom right.
| Assessment Data | Total Disagree | Total Agree | Total # of Responses | Percentage of Agree (Total Agree/Total Responses) |
| 1. Explained the student’s social/emotional assessment results clearly in the written report. | | | | |
| 2. Orally explained my student’s social/emotional assessment results clearly in the team meeting. | | | | |
| 3. Orally explained the most critical findings to help me understand the student’s social emotional challenges. | | | | |
| 4. Provided useful clarification about my student’s social/emotional assessment in response to other team members’ questions and/or comments. | | | | |

| Recommendations | Total Disagree | Total Agree | Total # of Responses | Percentage of Agree (Total Agree/Total Responses) |
| 1. Wrote clear recommendations. | | | | |
| 2. Provided rationale for why the recommendations were given. | | | | |
| 3. Provided recommendations that address the student’s social/emotional challenges. | | | | |
| 4. Provided recommendations that are feasible given the resources available within the school/classroom setting. | | | | |

| Total Scores | | | | AVG % Agree Overall: |
How should Gain or Growth Scores be calculated?
This DDM is measuring an essential job function/responsibility. The design team indicated that
these functions are currently largely achieved in their districts, so their responsibility is to maintain
these high standards. As a result this DDM is designed as a target measure. If a district needs to
build capacity to achieve these functions, however, this DDM can be modified to serve as a growth
measure. This may be particularly useful for individuals new to the district and/or new to the role.
The school psychologist’s goal is for his or her Average % Agree Overall score (bottom, far right
on the template) to fall within the Moderate or High Target range, as specified in the Target
Parameters. (See Target Parameters section, below.)
A completed survey and a scoring rubric based on 10 collected surveys are provided below
as an illustration of the scoring process.
Measuring Growth and Setting Parameters
The development team discussed the number of easy, moderate, and hard items included in the
measurement tool—see Assessment Blueprint. They also discussed the degree to which the
various items indicate that the school psychologist is meeting the target of effectively
communicating social/emotional assessment results. Based on the discussion, the team estimated
the range of performance that would be considered meeting the MSPA standards and noted these
parameters as “Moderate” performance.
During the first year of administration, districts may wish to examine the average scores received by
a group of school psychologists and may modify these estimated target parameters accordingly.
Estimated Target Parameters

| Target | Total Percentage |
| Low Performance | 0-69% |
| Moderate Performance | 75-90% |
| High Performance | 90-100% |
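Interpreting an overall score against these parameters is a simple band lookup. A minimal sketch, under stated assumptions: since the published bands overlap at 90% and leave 70-74% unassigned, this sketch treats exactly 90% as High and anything below 75% as Low; the function name is illustrative, not part of the DDM.

```python
# Map an Average % Agree Overall score to the Estimated Target Parameters.
# Assumption: 90% counts as High; the unassigned 70-74% gap falls to Low.

def impact_level(pct_agree):
    if pct_agree >= 90:
        return "High Performance"
    if pct_agree >= 75:
        return "Moderate Performance"
    return "Low Performance"
```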
Piloting
Development Process
This measure for school psychologists was refined through both small trials and a pilot during May
2015. During initial development, school psychologists on the development team conducted small
tryouts of the survey tool at team meetings where social/emotional assessment results were
discussed and at school-wide staff meetings to get a range of opinions. These trials led to various
refinements of the measure’s items and descriptors, as well as to the scoring tools and
administration protocol. For example, the original survey contained four Likert-type response
categories, but the majority of individuals who previewed the survey felt that they would be more apt
to complete the survey if they only had two response choices. Additionally, wording on items was
changed given that some original items were considered “confusing.”
In addition, the assessment received multiple peer and expert reviews during initial development.
For example, staff at WestEd (Woburn, MA) critiqued the assessment task and associated scoring
materials, and an ESE review team also provided critical feedback, which led to further revisions.
For example, WestEd suggested that the survey tool clearly differentiate between questions about
oral communication and written communication. WestEd’s feedback also prompted us to consider
the importance of the timing of survey distribution.
Finally, a longer term pilot of the assessment in the developers’ own schools and districts
contributed to the development of increasingly specific, feasible, and informative tools for potential
users. The DDM was piloted in the school districts of Whitman-Hanson Regional Schools, the
Wellesley Schools, and the Scituate Schools. The measure was piloted during the month of May
2015. The pilot was designed to answer the following questions:
Question 1: How clear and complete are each of the sections of the DDM to school
psychologists who were not part of the development process?
Question 2: Did teachers who completed the survey feel the directions on the survey tool were
easy to follow; was the survey tool user-friendly?
Question 3: What scores will the school psychologists receive from the survey?
The pilot process consisted of cognitive interviews with school psychologists, who distributed the
survey, and teachers, who completed the survey. The cognitive interviews with school psychologists
involved reading the DDM document to answer Question 1. Cognitive interviews with teachers who
completed the survey addressed Question 2.
In order to address Question 3 of the Pilot Plan, the survey was distributed by school psychologists
following special education TEAM meetings that involved the presentation of social/emotional
assessment results. The school psychologist presenting the assessment results was responsible for
giving the survey to teachers. School psychologists who were not members of the development
team returned the completed surveys to a specific development team member who was identified
as the contact person. That individual completed the scoring. The development team met in June to
analyze data, and made final revisions to the DDM in response to insights gained from this analysis.
Pilot Results
We analyzed the data to identify overall strengths of our DDM, as well as concerns or issues that
indicated a need for DDM revisions.
From Question 1, we learned that:
 Overall, psychologists felt the protocol was straightforward, clear, and easy to follow.
 The sample scoring template was a helpful guide to the school psychologists who
implemented it.
 There were concerns about the administration process, including:
o Which teachers would receive the survey – all teachers involved in social-emotional
evaluations within the distribution period or a selected few?
o Would teachers provide honest ratings if the survey was to be returned directly to the
person being evaluated?
In response to the questions raised through the pilot, some of the Administration Procedures were
either clarified or modified. Based on the feedback, we clarified that all teachers who attended Team
meetings that fit the criteria for this DDM (i.e., meetings involving a social/emotional assessment)
would receive the survey. To the Administration Protocol section, we added a suggested minimum
sample size of 10 reports or 20% of the total evaluation caseload that includes a social/emotional
component, whichever is greater.
To address the issue of anonymity, we clarified that teachers would turn in surveys to a third party.
We also developed a sample cover letter to accompany the survey that would: (1) explain the
purpose of the survey, (2) outline the process for survey return for the purposes of preserving
anonymity and increasing respondent willingness to participate, and (3) encourage the respondent
to give honest feedback for the purposes of guiding the school psychologist’s practice.
From Question 2, we learned that:
 Overall, the survey was user-friendly and easy to use.
 A Likert scale was suggested in place of the “Agree or Disagree” format; the team felt,
however, that given the nature of the data, the original format should be retained. Because
these surveys are based on a single case, a scale measuring frequency (e.g., never,
sometimes, often) would not be applicable, nor would a scale measuring the level of
respondent agreement (e.g., somewhat agree, agree, somewhat disagree, disagree) be
appropriate. If any of the recommendations or results were unclear, the rating should then be
“disagree.” Clarification was added to the survey directions to inform teachers that “disagree”
is the appropriate response in such situations.
From Question 3, we learned that:
 The survey collection procedure, in which the school psychologist him/herself collects the
surveys, may inflate the ratings.
 The survey item stating “Provided useful clarification about my student’s social/emotional
assessment in response to other team members’ questions and/or comments” could warrant
a “not applicable” response, which would affect the results. The disclaimer, “If further
clarification was not needed, please check agree,” was therefore added to this item, on the
assumption that a report and meeting presentation that prompted no requests for
clarification were clear.
Pilot Survey Results for Seven School Psychologists
*Number next to each item is the % of Agreement (final column on the Summary Scoring Template)

| | PsychA | PsychB | PsychC | PsychD | PsychE | PsychF | PsychG |
| Surveys Distributed | 4 | 2 | 5 | 3 | 3 | 4 | 9 |
| Surveys Completed | 2 | 2 | 3 | 2 | 3 | 4 | 4 |
| Percentage Completed | 50% | 100% | 60% | 67% | 100% | 100% | 44% |
| Assessment Data | | | | | | | |
| Item AD1* | 100% | 100% | 100% | 100% | 100% | 100% | 100% |
| Item AD2* | 100% | 100% | 100% | 100% | 100% | 100% | 100% |
| Item AD3* | 100% | 100% | 100% | 100% | 100% | 100% | 100% |
| Item AD4* | 100% | 100% | 100% | 100% | 100% | 100% | 100% |
| Recommendations | | | | | | | |
| Item R1* | 100% | 100% | 100% | 100% | 100% | 100% | 100% |
| Item R2* | 100% | 100% | 100% | n/a | 100% | 100% | 100% |
| Item R3* | 100% | 100% | 100% | 100% | 100% | 100% | 100% |
| Item R4* | 100% | 100% | 100% | 100% | 100% | 100% | 100% |
| Average % Agree Overall | 100% | 100% | 100% | 100% | 100% | 100% | 100% |
The observations noted above provided further reason to change the administration protocol
regarding the survey collection procedure as discussed in the explanation of changes related to
Question 1. After reviewing the data from the pilot, it is clear that a larger sample size is needed for
the school psychologist to draw conclusions and identify patterns. Furthermore, a larger sample size
will be more representative of the school psychologist’s practice.
The highly positive ratings on the pilot data provided further evidence of the need to preserve
anonymity of the respondents and encourage honest feedback. It is recommended that districts pilot
this DDM again with the new survey collection procedure and cover letter to see if the feedback is
more helpful in informing practice. If there does not seem to be a significant difference, the district
could consider adding a Likert scale to the survey.
A note about the growth model was added to the Measuring Growth and Setting Parameters section
suggesting that professionals new to a district might consider using a growth model rather than a
target model. To show growth, the new school psychologist could compare the ratings from the
beginning of the year with those at the end of the year. It would be expected that new
school psychologists would have higher ratings during the second half of the year.
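Under a growth model, the comparison described above reduces to the difference between the two administrations' Average % Agree Overall scores. The sketch below is purely illustrative; the function names and example figures are hypothetical, not pilot data.

```python
# Illustrative growth calculation for a school psychologist using this
# DDM as a growth measure: compare the Average % Agree Overall from
# surveys collected at the beginning of the year with those collected
# at the end of the year.

def percent_agree(total_agree, total_responses):
    """Average % Agree Overall for one collection window."""
    return round(100 * total_agree / total_responses, 1)

def growth(fall_pct, spring_pct):
    """Point change between beginning- and end-of-year ratings."""
    return spring_pct - fall_pct
```

For example, hypothetical fall ratings of 18 agreements out of 25 responses (72.0%) against spring ratings of 23 out of 25 (92.0%) would show a 20-point gain over the year.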
Assessment Blueprint
This measure is aligned to the following Core Course Objective (CCO): Evaluators will
communicate, orally and in writing, students’ social/emotional testing data and recommendations in
a way that teachers perceive as clear, relevant, and practical. Please see additional information
about the development of this CCO in the Introduction on page 2.
The following Assessment Blueprint provides an overview of the design and challenge of the
assessment. In this assessment, all items are equally weighted. The Assessment Blueprint notes:
(1) the content being measured; (2) the percentage of the measure’s score that is devoted to each
aspect of this content; (3) whether the content is assessed as a growth measure or a target; and (4)
the relative difficulty of the items.
Assessment Blueprint - Indirect Measure

Step 1: Enter the total # of points you want to include in the assessment. (100 points)
Step 2: Enter the relative weight you want to assign to covered content.
Step 3: Confirm point allotments.
Step 4: Describe the types of items you want to develop in terms of cognitive complexity and item
difficulty. The corresponding number of items and points per item will be determined.
Step 5: Confirm number of items and average points per item.

| Content (Job Responsibility) | Weight (% of Overall Measure) | # of allotted points | Growth/Target | # of items in content area | Average points per item |
| 1. Evaluators communicate students’ social/emotional assessment data in ways that teachers perceive as clear, both orally and in writing. (MSPA Standard I-C-2) | 50% | 50 | Target | 4 | 12.5 |
| 2. Evaluators will communicate social/emotional recommendations in written reports in ways that teachers perceive as clear. (MSPA Standard I-C-1) | 12.5% | 12.5 | Target | 1 | 12.5 |
| 3. Evaluators will communicate social/emotional recommendations in written reports in ways that teachers perceive as relevant. (MSPA Standard I-C-1) | 25% | 25 | Target | 2 | 12.5 |
| 4. Evaluators will communicate social/emotional recommendations in written reports in ways that teachers perceive as practical. (MSPA Standard I-C-1) | 12.5% | 12.5 | Target | 1 | 12.5 |
| Totals | 100% | 100 points | | 8 items | 12.5 |
DDM Summary Scoring Template
Item Analysis
1. Tally all categories after all the surveys have been returned.
2. Record the Total Disagree and Total Agree responses for each row.
3. Record the Total Number of Responses for each row. (Remember, educators may skip some items, so
these total numbers may vary across items.)
4. Going across each row, divide the Total Agree number by the Total Number of Responses to
determine the percentage of agreement. Record this percentage in the final column.
Overall Impact
1. Add the values in the Total Disagree column and record at the bottom of the column.
2. Add the values in the Total Agree column and record at the bottom of the column.
3. Add the values in the Total Number of Responses column and record at the bottom of the column.
4. Divide the Total Agree by the Total Number of Responses to determine an overall percentage of
responses that indicate favorable perceptions. Record in the final cell at the far, bottom right.
| Assessment Data | Total Disagree | Total Agree | Total # of Responses | Percentage of Agree (Total Agree/Total Responses) |
| AD1. Explained the student’s social/emotional assessment results clearly in the written report. | | | | |
| AD2. Orally explained my student’s social/emotional assessment results clearly in the team meeting. | | | | |
| AD3. Orally explained the most critical findings to help me understand the student’s social emotional challenges. | | | | |
| AD4. Provided useful clarification about my student’s social/emotional assessment in response to other team members’ questions and/or comments. | | | | |

| Recommendations | Total Disagree | Total Agree | Total # of Responses | Percentage of Agree (Total Agree/Total Responses) |
| R1. Wrote clear recommendations. | | | | |
| R2. Provided a rationale for why the recommendations were given. | | | | |
| R3. Provided recommendations that address the student’s social/emotional challenges. | | | | |
| R4. Provided recommendations that are feasible given the resources available within the school/classroom setting. | | | | |

| Total Scores | | | | AVG % Agree Overall: |
Sample Cover Letter
Date:_________
Dear Teacher,
The attached survey is for a District Determined Measure (DDM) designed to gain information about my
practice as a school psychologist. I am specifically looking for feedback on a presentation I made at a recent
special education team meeting, which you attended, involving social/emotional and behavioral assessment
tools. The DDM is designed to provide feedback related to my written and verbal communication of testing
results and recommendations. Any feedback you can provide, positive or negative, is welcome, as it will help
to improve my practice. I am available to discuss any questions you may have about the survey. Please
consider the constraints of a school-based setting when responding, and, if you are uncertain about a
response, please take the time to refer back to the student’s file.
Please return this to [a third party] by [date].
Thank you for your time,
[signed school psychologist]
Social/Emotional Assessment Survey

Please take 5-10 minutes to complete the following survey regarding your student’s recent
social/emotional assessment and the recommendations offered in the report and presented in the
meeting. Please read each question carefully and put an “X” in the box under your corresponding
answer. Please complete all items. If any of the recommendations or results were unclear or
confusing, your response should be “disagree.” There is space at the end of this survey should you
want to clarify your responses or offer any additional comments.

Assessment Data
Based on my student’s recent social/emotional evaluation, the school psychologist:

| | Agree | Disagree |
| 1. Explained the student’s social/emotional assessment results clearly in the written report. | | |
| 2. Orally explained my student’s social/emotional assessment results clearly in the team meeting. | | |
| 3. Orally explained the most critical findings to help me understand the student’s social emotional challenges. | | |
| 4.* Provided useful clarification about my student’s social/emotional assessment in response to other team members’ questions and/or comments. | | |

Recommendations
Based on the social/emotional evaluation findings, the school psychologist:

| | Agree | Disagree |
| 1. Wrote clear recommendations. | | |
| 2. Provided rationale for why the recommendations were given. | | |
| 3. Provided recommendations that address the student’s social/emotional challenges. | | |
| 4.** Provided recommendations that are feasible given the resources available within the school/classroom setting. | | |

* Assessment Data question 4: Please check “agree” if further clarification was not needed because the presenter was
clear in his/her presentation.
** Recommendations question 4: When determining the appropriateness of the school psychologist’s recommendations,
consider whether the recommendations address the student’s language proficiency and other cultural factors that may
be relevant to ensuring the student’s success.

Comments: