Accountability & Assistance Advisory Council (AAAC) Meeting Notes
October 11, 2017 (9:30am – 12:30pm)
Best Western Marlborough (181 Boston Post Road, Marlborough)

AAAC members in attendance: Ethan Cancell, Hardin Coleman, Jason DeFalco, Sam DePina, Ranjini Govender, David Krane, Meg Mayo-Brown, Kathryn McDermott, Paul Schlichtman, Mary Skipper, Elizabeth Tripathi

ESE staff in attendance: Robert Curtin, Erica Gonzales, Russell Johnston, David Parker, Rebecca Shor, Joan Tuttle


Robert Curtin, Associate Commissioner of Data and Accountability at the Department of Elementary and Secondary Education (ESE), convened the meeting by welcoming Council members. Meg Mayo-Brown, Council Co-Chair, introduced the new co-chair leadership model for the Council and reviewed the Council’s norms and protocol for discussion. Mr. Curtin reviewed 2017 assessment and accountability reporting information, followed by an update on the submission and approval of Massachusetts’ Every Student Succeeds Act (ESSA) state plan. Finally, Russell Johnston (Senior Associate Commissioner) and staff from the Center for District Support (Joan Tuttle, David Parker, and Rebecca Shor) facilitated a discussion about current and future assistance efforts in Massachusetts. The following notes were recorded during the whole-group discussion among Council members, and a copy of the slide presentation can be found at

Review of 2017 assessment and accountability reporting

  • Rob Curtin reviewed the April 2017 amendment to state accountability regulations relating to 2017 accountability determinations for districts and schools that administered the Next-Generation MCAS tests. Schools and districts that administered the new tests in grades 3 through 8 will receive an accountability determination of “no level,” so long as participation and graduation rate requirements are met. Accountability determinations will remain the same for high schools that did not administer the Next-Generation MCAS tests.
  • One Council member sought clarification on the purpose of the amendment. Mr. Curtin noted that the regulation changes were put into place due to the assessment transition, not because of the state’s approved ESSA state plan.
  • A Council member asked how accountability determinations will be made for districts with one or more Level 4 schools. Mr. Curtin explained that the amendment will apply to the district, and that the district will not be assigned an accountability and assistance level so long as it maintains at least 90 percent participation for each subgroup.
  • Mr. Curtin described the assessment data that will be available to districts, schools, and the public. In addition to achievement results, schools will receive achievement percentiles, which provide information about how the school is performing compared to other schools that administered the Next-Generation MCAS tests. He cautioned against making year-to-year comparisons when reviewing 2016 MCAS and 2017 Next-Generation MCAS achievement results, but noted that the Student Growth Percentile (SGP) can be used to make comparisons across years.
  • One Council member asked what the minimum group size is for reporting assessment and accountability data. Mr. Curtin explained that the minimum group size is 10 students for assessment results and 20 students for accountability data (a brief sketch of this suppression logic follows this list).
  • A Council member asked if data will be reported in the same place. Mr. Curtin noted that assessment reports will be the same, with small changes to reflect the fact that achievement will be reported differently for Next-Generation MCAS tests (e.g., achievement percentiles and average scale scores instead of Composite Performance Index (CPI) data).
  • One member asked if there is a criterion-referenced component of assessment results that signals readiness for the next level (e.g., college or career). Mr. Curtin explained that the new MCAS achievement level descriptors focus on preparedness for success at the next level (e.g., expectations of the next grade). The test, the standards that are measured, and the achievement levels are different from the legacy test. He also noted that standard-setting for the Next-Generation MCAS test was done by educators and other experts, not ESE staff.
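The reporting rules described above lend themselves to a short illustration. The Python fragment below is a minimal sketch, assuming a simple in-memory layout (each group mapped to a student count and a result) and a naive rank-based percentile; the constant names, data shapes, and percentile formula are assumptions for illustration, not ESE’s actual reporting implementation.

    # Illustrative only: apply minimum group sizes before reporting results,
    # and compute a naive rank-based achievement percentile for a school.
    ASSESSMENT_MIN = 10      # minimum group size for assessment results
    ACCOUNTABILITY_MIN = 20  # minimum group size for accountability data

    def suppress_small_groups(groups, min_size):
        """Mask results for any group with fewer than min_size students."""
        return {name: (result if n >= min_size else None)
                for name, (n, result) in groups.items()}

    def achievement_percentile(school_result, all_results):
        """Percent of comparison schools scoring below this school
        (a crude stand-in for ESE's percentile methodology)."""
        below = sum(1 for r in all_results if r < school_result)
        return round(100 * below / len(all_results))

    # Hypothetical subgroup results: (number of students, average scale score)
    subgroups = {
        "all students": (240, 502.1),
        "students with disabilities": (35, 489.4),
        "English learners": (8, 495.0),  # below both thresholds
    }

    print(suppress_small_groups(subgroups, ASSESSMENT_MIN))      # masks English learners
    print(suppress_small_groups(subgroups, ACCOUNTABILITY_MIN))  # masks English learners
    print(achievement_percentile(502.1, [486.0, 494.5, 502.1, 509.8, 515.2]))  # 40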

Update on Massachusetts’ ESSA state plan

  • Mr. Curtin reviewed the approval process for Massachusetts’ ESSA state plan. The plan was approved in September after ongoing discussions with the United States Department of Education. He provided an update on the revisions that were made to the plan in order to gain approval, specifically noting changes related to the use of average scale score data in accountability determinations and the classification of small schools.
  • Related to the use of average scale score:
      • One Council member asked how ESE plans to report and message the achievement value. Mr. Curtin noted that ESE will need to report a number of different data points that go into the accountability calculations, and must do so in a way that is not confusing.
      • Another Council member asked when the new accountability calculations will go into effect. Mr. Curtin stated that this will occur in the fall of 2018.
      • Another member asked if the use of average scale score will exist as part of the state plan, or if it will be written into regulation as well. Mr. Curtin noted that state accountability regulations likely will not include detailed calculations, but that some changes will need to be made to reflect our approved plan.
  • Related to the classification of small schools, one member noted that many alternative high schools fit into this category and asked how determinations will be made for these schools. Mr. Curtin noted that ESE is interested in pursuing flexibility for classifying these schools.

Gap-closing

  • Mr. Curtin introduced an analysis of gap-closing in Massachusetts using historical MCAS data. For the purposes of planning the new accountability system, ESE is looking at whether schools are narrowing gaps between student groups. Early thinking in this area included using the high needs group to measure gap-closing; however, not all schools and districts have distinct high needs and non-high needs groups in all years. Additional analysis of gap-closing between high- and low-performing groups highlighted different issues: comparisons between race/ethnicity and need (e.g., white students versus students with disabilities); comparisons between non-similar groups across schools (e.g., white students versus students with disabilities in one school, and Asian students versus Hispanic/Latino students in another school); and gap-closing as a result of a decline in performance by the high-performing group. ESE’s present thinking focuses on measuring the improvement of the lowest-performing students in every school and district. The theory behind this approach is that if every school raises the performance of its lowest-performing students (“raising the floor”), gaps would close. Simulations were done using annual comparisons and by following the performance of cohorts of students across years (a simplified sketch of the cohort approach appears at the end of this section).
  • One Council member noted that the use of a cohort model is preferred. They also suggested doing this analysis using data from an urban district, not just statewide data. This would highlight disproportionality, if any, and would determine whether this approach works for groupings of schools (e.g., urban, rural, suburban), not just the state as a whole. To this, another member noted that the cohort model should work for urban systems, showing that when students remain in one school or district for a number of years, they perform well. Mr. Curtin acknowledged this, but cautioned against developing an accountability system that reflects only the performance of students who are stable.
  • Another member noted that the cohort model for measuring improvement would address the frequent concern in large urban districts that the accountability system does not capture the targeted work that they do with low-performing students.
  • One member asked if change would be measured by using assessment scale scores. Mr. Curtin noted that it could be done a number of different ways, but the hope would be to move toward using average scale score as a measure of achievement.
  • One member asked if any analysis would be done related to the performance of the top 25 percent of students. They expressed the concern that it could look as though gaps are closing if the top-performing group has a decline in performance. To this, another Council member wondered if ESE would define a top 25 percent threshold as a goal for the low-performing group. Mr. Curtin explained that improving the performance of the lowest quartile of students would be used in combination with overall school performance.
  • Another Council member noted that beyond the analysis or reporting of data, school improvement practices should be reviewed. Schools need to make sure that practices put into place to address the needs of the lowest-performing students do not negatively affect the other 75 percent of students in the school.
  • One Council member asked about the weighting of gap-closing measures in the overall accountability calculation. Mr. Curtin stated that weighting has not been discussed or finalized yet.
  • One member asked if the lowest quartile would be a subgroup. Mr. Curtin noted that ESE would continue to report on all subgroups, as required by law. This would be a component of our system, but ESE will still have to identify schools with any low-performing subgroups.
  • Another member asked if other states are doing anything similar. Mr. Curtin said that the cohort model appears to be unique to our state, but that others are looking at change over time among low-performing students as a group.
  • One Council member suggested that ESE move away from the language of “gap-closing.” They liked the idea of “raising the floor.”
  • One member asked how this measure is different from the Composite Performance Index (CPI). Mr. Curtin noted that ESE intends to move away from CPI because it does not differentiate enough between levels of performance, especially at the higher end of the achievement level scale.
  • One member asked why the analysis looked at the lowest 25 percent and not a larger percentage. Mr. Curtin explained that the intention was to make sure the lowest-performing group was different from the “all students” group in terms of number of students.
  • A Council member stated that this new group may help capture low-performing students that belong to other subgroups that are too small for reporting.
  • Potential issues with this measure of gap-closing that were identified during discussion:
      • The lowest quartile could fall below the minimum group size in small schools;
      • Measuring the cohort in science does not work because science is only tested in grades 5, 8, and 10;
      • High schools only administer the MCAS in one grade, so cohort change could not be measured (a similar concern was expressed for K-3 schools); and
      • Schools might not worry about the last grade in the school, because those students have matriculated to another school by the time the results are reported.
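To make the cohort discussion above concrete, the following Python fragment is a minimal sketch of the “raising the floor” idea: identify a school’s lowest-scoring 25 percent of students in a baseline year, then follow that same cohort’s average scale score into a later year. The data shapes, quartile cutoff, and the choice to count only students tested in both years are assumptions for illustration, not ESE’s actual methodology.

    # Illustrative only: track the baseline-year lowest quartile as a cohort.
    def lowest_quartile_cohort(scores_by_year, baseline_year):
        """IDs of the lowest-scoring 25 percent of students in the baseline year."""
        baseline = scores_by_year[baseline_year]
        ranked = sorted(baseline, key=baseline.get)  # student IDs, lowest score first
        cutoff = max(1, len(ranked) // 4)            # bottom quartile
        return set(ranked[:cutoff])

    def cohort_change(scores_by_year, baseline_year, later_year):
        """Average scale-score change for the baseline lowest quartile, counting
        only students still tested in the later year (the stability caveat
        raised in discussion applies here)."""
        cohort = lowest_quartile_cohort(scores_by_year, baseline_year)
        stayers = cohort & scores_by_year[later_year].keys()
        if not stayers:
            return None  # cohort too mobile to measure
        before = sum(scores_by_year[baseline_year][s] for s in stayers) / len(stayers)
        after = sum(scores_by_year[later_year][s] for s in stayers) / len(stayers)
        return after - before

    # Hypothetical scale scores keyed by student ID, by year
    scores = {
        2017: {"s1": 470, "s2": 480, "s3": 505, "s4": 512,
               "s5": 498, "s6": 520, "s7": 488, "s8": 530},
        2018: {"s1": 481, "s2": 486, "s3": 507, "s5": 500,
               "s6": 522, "s7": 495, "s8": 528},  # s4 left the school
    }

    print(cohort_change(scores, 2017, 2018))  # +8.5 for the 2017 lowest quartile

An annual-comparison variant would instead recompute the lowest quartile each year and compare group averages, which is simpler but does not follow the same students across years.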

Assistance strategies in Massachusetts

  • Dr. Johnston gave a general overview of the assistance provided by ESE’s Center for District Support, setting context for the assistance work of the five offices within the center. Joan Tuttle reviewed the current assistance efforts that come out of the Office of District and School Turnaround (ODST), followed by David Parker’s explanation of the assistance provided by the District and School Assistance Centers (DSAC). Within these offices, the focus of the work is on the lowest 10 percent of schools. Additional networks serve schools in percentiles 11-20. Finally, Rebecca Shor described initiatives across ODST and DSAC (e.g., Tiered Systems of Support, Wraparound Zones, Focus Academies, and partners). These assistance focus areas are aligned to priority, high-leverage turnaround practices.
  • When asked what resources, supports, and/or practices ESE can leverage to enhance assistance focus areas, Council members provided the following feedback:
      • Supporting district-level organizational management;
      • Encouraging self-reflection related to systems and structures that perpetuate school performance (e.g., enrollment, student assignment, programming for special populations, substitute availability, etc.);
      • Helping schools and staff effectively balance time and responsibilities;
      • Understanding how turnaround and other school improvement strategies impact staffing;
      • Training leaders to manage partners well and involve them consistently;
      • Purposefully distributing leadership responsibilities, and eliminating redundancies where they exist;
      • Clearly articulating to districts the expectations around staffing and/or resources that are needed to comply with mandates; and
      • Creating connections and networks for people with job-alike functions (middle managers and leaders).

Next steps

  • ESE will model new accountability calculations in the fall and share them with the Council at the December meeting.
  • ESE will maintain a focus on accountability and assistance in future Council discussions.
