District-Determined Measure Example

Effective Use and Communication of Social/Emotional Assessment Data
Content Area and Grade Range: Social/Emotional Data, grades K-12
DDM Summary: This DDM assesses the ability of school psychologists, or other evaluators, to clearly explain and connect social/emotional assessment results and subsequent classroom recommendations through written reports and team meeting presentations.

Developed by: Colleen McDonald, School Psychologist (Whitman-Hanson Regional School District), Sarah Hargrove, School Psychologist (Whitman-Hanson Regional School District), and Tara-Jean Grabert, School Psychologist (Whitman-Hanson Regional School District)

Reviewed by: Sonya Meiran (ESE); Matt Hollaway (ESE); Craig Waterman (ESE)
Pilot Districts: Whitman-Hanson Regional School District, Wellesley Public Schools, Scituate Public Schools

Date updated: June 2015

Table of Contents

Introduction

Instrument

Administration Protocol

Scoring Guide

Measuring Growth and Setting Parameters

Piloting

Assessment Blueprint


Introduction

This DDM is a target measure, rather than a growth measure, of the school psychologist’s indirect impact on student learning. Specifically, it consists of a survey designed to solicit teachers’ perceptions of the extent to which the school psychologist’s verbal and written communication of a student’s social/emotional assessment data has informed their understanding of the student’s difficulties and of how those difficulties affect the student’s learning. Additionally, it asks about teachers’ understanding of the psychologist’s recommendations, based on the reported social/emotional assessment data, and their perceptions of whether these recommendations are feasible to implement in the school setting as a means of addressing the student’s social/emotional challenges. This brief survey consists of eight items that ask respondents to indicate “agree” or “disagree” as to whether the listed expectation was met either during the team meeting or through the school psychologist’s written report.

This assessment for evaluators is newly developed and has been refined through a series of small pilots with teachers, but has not been officially administered as a DDM. Districts are encouraged to adopt this DDM as designed, or refine and/or modify the survey, scoring template, and administration protocol to suit their unique needs and local circumstances. For example, although this tool was designed as a target measure, it can also be modified to serve as a measure of an evaluator’s growth over time. Additionally, while this DDM focuses on the assessment of students’ social/emotional skills, districts can modify it to apply to a variety of assessment results, including cognitive, academic achievement, speech/language, etc.

This measure is aligned to the following Core Course Objective (CCO): Evaluators will communicate, orally and in writing, students’ social/emotional testing data and recommendations in a way that teachers perceive as clear, relevant, and practical.

A CCO is a statement that describes core, essential, or high priority content—i.e., knowledge, skills, or abilities—identified by those who designed the assessment, which is drawn, synthesized, or composed from a larger set of curriculum or professional standards.

This CCO was identified as the basis for the DDM due to the increasing number of social/emotional evaluations that school psychologists must complete. Additionally, the school psychologists’ communication of social/emotional assessment data plays a crucial role in determining services and supports for students. To ensure that appropriate classroom services are delivered, teachers working directly with students with social/emotional difficulties must understand the evaluation data reported by the school psychologist. Teachers must also understand the recommendations that are described in the reports and team meetings. This DDM serves the purpose of assessing the school psychologist’s oral and written communication abilities when evaluating students with social/emotional difficulties. It allows teachers to indicate if the school psychologist’s written and oral reports are clear and support the teacher’s work with the student.

Content (Job Responsibility) / Weight
1.  Evaluators communicate students’ social/emotional assessment data in ways that teachers perceive as clear, both orally and in writing.
MSPA Standard I-C-2: School psychologists present key, relevant findings to colleagues clearly, respectfully, and in sufficient detail to promote effective collaboration that supports improved student learning and/or development. / 50% of the measure
2.  Evaluators will communicate social/emotional recommendations in written reports in ways that teachers perceive as clear.
MSPA Standard I-C-1: School psychologists skillfully interpret assessment findings and relate them to educational performance, needs, and recommendations. / 12.5% of the measure
3.  Evaluators will communicate social/emotional recommendations in written reports in ways that teachers perceive as relevant.
MSPA Standard I-C-1: School psychologists skillfully interpret assessment findings and relate them to educational performance, needs, and recommendations. / 25% of the measure
4.  Evaluators will communicate social/emotional recommendations in written reports in ways that teachers perceive as practical.
MSPA Standard I-C-1: School psychologists skillfully interpret assessment findings and relate them to educational performance, needs, and recommendations. / 12.5% of the measure
Total: 100%

Instrument

This assessment is a brief eight-item survey that asks respondents to indicate “agree” or “disagree” as to whether the listed expectation was met either during the team meeting or through the school psychologist’s written report. For example, “[The school psychologist] explained the student’s social/emotional assessment results clearly in the written report.” Four of the items measure the educator’s perception of the clarity of the school psychologist’s presentation of the student’s assessment data, and four measure perceptions of the clarity, feasibility, and relevance of the school psychologist’s recommendations. There is also an open-ended comments section should the rater wish to provide more specific feedback about his/her experience with the school psychologist’s report or presentation in the meeting.

Administration Protocol

This administration protocol is provided to increase the likelihood that the assessment is administered in a fair and consistent manner across all students, classrooms, and/or schools that use the same DDM and over time. Adherence to this protocol will reduce the variation in local decisions when administering the assessment; it will also increase the comparability of the data collected.

When is the measure administered?
The survey tool was developed to elicit feedback from teachers following a student evaluation that includes a social/emotional assessment. Throughout the school year, surveys are distributed to teachers’ mailboxes following all TEAM meetings that involve the discussion of social/emotional assessment results.

The school psychologist is expected to place a survey in a participating educator’s mailbox within 24 hours of the TEAM meeting. To preserve anonymity, an automatic email reminder will be sent one week after distribution to all teachers who received the survey, even if an individual teacher has already returned the form. Those surveyed should be the individual(s) responsible for implementing the recommendations made by the school psychologist. In some instances, this will be the regular education classroom teacher; it is also appropriate for special education teachers to provide feedback if they are the staff primarily responsible for meeting the student’s needs. Those providing feedback must have received a copy of the evaluator’s written report and must have attended the TEAM meeting.

How is the measure administered?
The school psychologist should prepare for each assessment by printing a copy of the survey and cover letter prior to the TEAM meeting. In addition, return rates improve when the psychologist touches base informally with the participating educators. The purpose of that communication is to remind teachers that they will be receiving the DDM survey, what it is about (particularly referencing the psychologist’s report and recommendations), and how the results will be used.

The survey includes instructions for the educator to follow. The instructions read: Please take 5-10 minutes to complete the following survey regarding your student’s recent social/emotional assessment and the recommendations offered in the report and presented in the meeting. Please read each question carefully and put an “X” in the box under your corresponding answer. Please complete all items. If any of the recommendations or results were unclear or confusing, then your response should be “disagree.” There is space at the end of this survey should you want to clarify your responses or offer any additional comments.

Additionally, the school psychologist provides a cover letter that: (1) explains the purpose of the survey, (2) outlines the process for survey return for the purposes of preserving anonymity and increasing respondent willingness to participate, and (3) encourages the respondent to give honest feedback for the purposes of guiding the school psychologist’s practice.

All teachers who attend TEAM meetings that fit the criteria for this DDM (i.e., meetings that include discussion of a social/emotional assessment) will receive the survey. It is suggested that the school psychologist obtain a minimum sample of 10 surveys or 20% of the total evaluation caseload involving social/emotional components, whichever is greater. The school psychologist notes a specific return date in a clear and visible place on the cover letter and places it in the teacher’s mailbox within 24 hours of the TEAM meeting. Teachers are asked to complete and return the survey to a designated third party within one week; this person will vary across schools/districts. The school psychologist keeps track of the number of surveys he/she has distributed and the number returned; it is not necessary, however, for teachers to identify themselves on the form. After one week, the school psychologist sends an automatic email reminder to all educators who received the survey, encouraging participation. These steps help ensure a high return rate.
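The minimum-sample rule above (10 surveys or 20% of the social/emotional evaluation caseload, whichever is greater) can be sketched as a small helper. This is an illustrative aid only, not part of the DDM; the function name is an assumption.

```python
import math

def minimum_surveys(caseload: int) -> int:
    """Return the minimum number of surveys to distribute:
    at least 10, or 20% of the caseload, whichever is greater."""
    return max(10, math.ceil(0.20 * caseload))
```

For example, a caseload of 40 social/emotional evaluations yields a minimum of 10 surveys, while a caseload of 80 yields 16.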

No modifications or accommodations are needed for administering the survey. Teachers should be encouraged, however, to consider the student’s language proficiency and other cultural factors when judging the appropriateness of the evaluator’s communication and recommendations.

How are deviations to protocols addressed? Designating a neutral third party (e.g., the school secretary) as the individual who receives the surveys should ensure a high rate of return. To better ensure confidentiality, SISPs may also wish to collect the surveys only three times per year, so that a survey is not seen immediately after the TEAM meeting in which it was distributed. Finally, a district using this as a growth measure may wish to tally results three times per year to monitor progress, rather than only at the conclusion of the year.

Scoring Guide

This assessment packet includes a Summary Scoring Template (below) to be used after the surveys have been collected. Two scores can be derived from this instrument. The first score (to the far right on the template) is a percentage of those respondents who responded “agree” on each item. The percentages calculated for the individual items can be used as feedback to guide the school psychologist’s future practice. The second score (bottom right corner) is an overall percentage of “agree” responses, which indicates the level of the school psychologist’s impact in the areas assessed in the survey tool. These percentages are later interpreted in relation to the Target Parameters to determine whether the school psychologist demonstrated low, moderate, or high impact on students using this DDM in this given year.

Scoring Process

Item Analysis

1.  Tally all categories after all the surveys have been returned.

2.  Record the Total Disagree and Total Agree responses for each row.

3.  Record the Total Number of Responses for each row. Remember, educators may skip some items, so these total numbers may vary across items.

4.  Going across each row, divide the Total Agree number by the Total Number of Responses to determine the percentage of agreement. Record this percentage in the final column. These scores can be used as specific feedback to guide future practice.

Overall Impact

1.  Add the values in the Total Disagree column and record at the bottom of the column.

2.  Add the values in the Total Agree column and record at the bottom of the column.

3.  Add the values in the Total Number of Responses column and record at the bottom of the column.

4.  Divide the Total Agree by the Total Number of Responses to determine an overall percentage of responses that indicate favorable perceptions. Record the percentage in the final cell at the far, bottom right of the table. This score can be used to determine the school psychologist’s level of impact. (See Target Parameters section.)
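The Item Analysis and Overall Impact arithmetic above can be sketched as follows. This is a minimal illustration, assuming tallies are recorded as (agree, disagree) pairs per item; the function and variable names are not part of the DDM.

```python
def score_surveys(tallies):
    """tallies: list of (agree, disagree) counts, one pair per survey item.
    Returns (per-item agreement percentages, overall % agree)."""
    per_item = []
    total_agree = 0
    total_responses = 0
    for agree, disagree in tallies:
        responses = agree + disagree  # skipped items are simply not counted
        per_item.append(100 * agree / responses if responses else None)
        total_agree += agree
        total_responses += responses
    # Overall Impact: Total Agree divided by Total Number of Responses
    overall = 100 * total_agree / total_responses
    return per_item, overall
```

For example, tallies of (9, 1) and (8, 2) on two items give per-item percentages of 90% and 80%, and an overall score of 85% (17 agree out of 20 responses).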

Who should score the assessment? The school psychologist should administer the survey, collect the completed forms from the third party, and tabulate the scores.

How should scorers prepare for scoring? The school psychologist should have all survey responses as well as a blank Summary Scoring Template in preparation for scoring.

DDM Summary Scoring Template

Item Analysis

1.  After all the surveys have been returned, tally all categories.

2.  Record the Total Disagree and Total Agree responses for each row.

3.  Record the Total Number of Responses for each row. (Remember, educators may skip some items, so these total numbers may vary across items.)

4.  Going across each row, divide the Total Agree number by the Total Number of Responses to determine the percentage of agreement. Record this percentage in the final column.

Overall Impact

1.  Add the values in the Total Disagree column and record at the bottom of the column.

2.  Add the values in the Total Agree column and record at the bottom of the column.

3.  Add the values in the Total Number of Responses column and record at the bottom of the column.

4.  Divide the Total Agree by the Total Number of Responses to determine an overall percentage of responses that indicate favorable perceptions. Record in the final cell at the far, bottom right.

Assessment Data / Total Disagree / Total Agree / Total # of Responses / Percentage of Agree (Total Agree/Total Responses)
1. Explained the student’s social/emotional assessment results clearly in the written report.
2. Orally explained my student’s social/emotional assessment results clearly in the team meeting.
3. Orally explained the most critical findings to help me understand the student’s social emotional challenges.
4. Provided useful clarification about my student’s social/emotional assessment in response to other team members’ questions and/or comments.
Recommendations / Total Disagree / Total Agree / Total # of Responses / Percentage of Agree (Total Agree/Total Responses)
1. Wrote clear recommendations.
2. Provided rationale for why the recommendations were given.
3. Provided recommendations that address the student’s social/emotional challenges.
4. Provided recommendations that are feasible given the resources available within the school/classroom setting.
Total Scores / AVG % Agree Overall

How should Gain or Growth Scores be calculated?
This DDM measures an essential job function/responsibility. The design team indicated that these functions are currently largely achieved in their districts, so their responsibility is to maintain these high standards. As a result, this DDM is designed as a target measure. If a district needs to build capacity to achieve these functions, however, the DDM can be modified to serve as a growth measure; this may be particularly useful for individuals new to the district and/or new to the role. The school psychologist’s goal is for his or her Average % Agree Overall score (bottom, far right on the template) to fall within the Moderate or High Target range, as specified in the Target Parameters. (See Target Parameters section, below.)