2003-04 Officers: Jonathan Dings, President; Connie Zumpf, Vice President; Mike Kirby

2015-2016 Officers:

Heather MacGillivary, President

Mya Martin-Glenn, Vice President

Lori Benton, Secretary/Treasurer

February 19, 2016 8:30 a.m. - noon

*********Meeting Agenda*********

Hosted by Adams 12 Public School District

Socializing (8:30)

1) Welcome, Introductions, ACEE Business

  • Treasurer’s Report - $2,361.32 with 76 paid members (pay your dues!!!)
  • ACEE suggested topics for end of year – March 11 (DPS or metro area); cancel April meeting
  • March and May topics: SPF 2.0 release, legislative update, ESSA, data privacy, social emotional learning, graduation requirements, competency based graduation, academic tenacity (BVSD), climate and culture measures, parent surveys, employee surveys, debrief on testing, WIDA ACCESS debrief
  • Nominations for 2016-17 Secretary – put on your radar for nominations for next meeting

2) ACT/PACT – SAT/PSAT – gather questions for CDE FAQ for Will Morton

  • ACT/SAT:
  • What is the alignment between the ACT and SAT?
  • What will the linking procedures from ACT to SAT look like?
  • Will CDE have additional resources for both ACT and PSAT/SAT?
  • How will CDE support districts and stakeholders throughout the state to embrace the change from ACT to SAT? Communication…
  • Correlation and prediction among student performance on tests from CMAS 9th to PSAT 10 to SAT
  • Could use additional communication help especially around the change
  • SAT:
  • Why?
  • Given the voucher for the writing, how will we manage the students who will attend that day? Is there a make-up for writing?
  • Transition to SAT – is the voucher process the same?
  • What evidence does CDE have that the SAT has been sufficiently revised so the previous issues of bias have been addressed?
  • How can we get National Merit Qualification with spring PSAT testing?
  • How can we assure that the test is improved to reduce bias?
  • What support/assessment do you suggest/support for pregrade 8?
  • How are districts handling transcripts?
  • Will writing always be optional?
  • How will opt-outs be handled, if SAT taken privately?
  • How do you calm parents’ concerns around loading student data into Khan Academy?
  • What is the longitudinal plan for comparisons when there is no historical data?
  • Without science…how can we measure student performance…if utilizing writing assessment what science standards?
  • How will DACs be trained in the data analysis: use of concordance tables? PSAT 2016 to SAT 2017?
  • Will accommodations shift to DAC for PSAT/SAT?
  • How can DACs verify that testing materials will be sent to schools including accommodated materials?
  • Is the plan to stay with paper or move to online?
  • When are trainings for interpreting PSAT and SAT results? Including for differentiated groups such as principals, teachers, students, districts, parents
  • When will CDE update the webpage to reflect the SAT info? (Still says under RFP…)
  • What is the plan for connecting SAT with AP resources?
  • What system will be put in place to tell schools where students are testing? (at school or at testing center with voucher)
  • Can writing portion of SAT be given at schools rather than vouchers? Testing center administration
  • How can we connect online test prep in a systemic way? Connect students and teachers
  • How confident are we in the new SAT since it has not been tested?
  • How comparable is the old SAT score to the new SAT?
  • How do we support our students in their readiness to take the tests, both paper and computer?
  • We are struggling to help train staff to prepare students…support?
  • Will SAT eliminate the need for 9th grade PARCC and PSAT under ESSA?
  • Are we getting a crosswalk from ACT to SAT? Need this ASAP…
  • PSAT:
  • What is the estimated timeline for data download from the portal, including file layouts and interpretation resources?
  • When will we get the data file?
  • What will be the file structure?
  • What is the role of the DAC for the PSAT/SAT?
  • Based on information from the PSAT rep, what is the recommendation for the transition for higher ed?
  • The College Board rep mentioned students being able to receive vouchers for fall PSAT/NMQT and additional 2 SATs for low income students. Could we get more info on this?
  • ACT:
  • What is the file structure?
  • When will we get the file?
  • ASPIRE:
  • Report request for historical test on same reports.
  • When will concordance tables be available for SAT/ACT and PSAT/ACT Aspire?

3) CDE Percentile Report – Alyssa Pearce and Josh Purdue

  • Data is not intended to be used for publicity
  • Guidance Document:
  • Percentile Rank reports are located on the Schoolview Performance website:
  • Questions included:
  • How can the range of the percentile rank for grade level not include the aggregate?
  • What about AECs? (These schools are excluded as well as facility schools)
  • Was 11th grade included? (No, only grades 3-10. It is inclusive of the content tests by grade level.)
  • What about participation rates? (Be cautious about interpretation related to participation)
  • What is the unit of analysis? (means by school)
  • What about the separate student percentile rank? (There are two reports, one by school with percentile rank comparisons and one by student score.)
  • Since parents can access them, are there talking points for the community? (CDE will work on that)
  • What was the thinking about subgroups being compared to all students? (TAP recommended both, but for this report it seemed to be confusing. It felt disingenuous to compare performance for students with disabilities to just their group. CDE will work on comparisons of like students. Cautions about only comparing the like groups since the target is a certain level of mastery for all students.)
  • Could we add information to the limitations document considering the grade level assessments versus content based tests and cautioning the participation rates with comparisons of performance?
  • What about consideration of mode, CBT versus PBT? (CDE did not see the same issues with the difference between online versus paper. A higher percentage of higher-performing districts gave paper, so it is difficult to know whether differences are related to mode or district, but it is worth CDE following up on this.)
  • What was the spread of PBT versus CBT?
  • Is this a unique report? Or will it continue in future years? (Built more for just this year to provide information while getting used to the new assessment. Preparing for state school board and cut scores for new SPF.)
  • Can you explain the mean scale score for TCAP versus how the SPF used a percentile rank comparison of percent proficient and advanced? This report's percentile rank of the mean TCAP scale score compares schools based on the scale score, not the percentage of proficient and advanced. If folks used this in their UIP, contact Erin at CDE with additional questions about targets that might have been set using the SPF percentile ranks, since that is not the metric portrayed on this report.
  • What about being able to recreate the calculations done? Especially at the student level? (Fairly complicated file, so CDE had to standardize it. People would like these files. CDE will figure out how to get this data out to districts)
  • SPF/DPF 2.0
  • Mock ups should be coming out in March.
  • Big things that will be different:
  • Achievement: Aggregate group and points assigned for aggregate group and super disaggregated group. Disaggregated groups will be listed but points not assigned
  • Also under Achievement: READ Act – performance of 3rd and 4th graders previously identified with an SRD while in K-3; points for performance on ELA will be bonus points only and will have to be translated from Partially Proficient or Above as currently written in the law
  • Growth – no AGPs only MGPs; ELA, Math, English Language Proficiency growth for points. Aggregate group and super disaggregated group for ELA and Math receive points. Disaggregated groups will be listed but points not assigned
  • PWR – same indicators and matriculation rate (postsecondary enrollment or certificate with most recent 4-year cohort) will receive points
  • Will not provide an overall rating for this year, really intended for information not accountability
  • Transitional growth reports will most likely be sent through Syncplicity, questions asked about whether to include these on the mockups or not…
  • Will be invited back for March or May to help go over them with ACEE
  • What about the qualitative measure for ESSA – implemented fall of 2017, so we have time to work through this, maybe have a brainstorming session for the afternoon of one of the next meetings?

4) Educator Effectiveness Overview of Measures of Student Learning from Across Colorado (10:45 – 11:30) - Gilad Wilkensen, Sarah Duran, Slope Research

  • PPT shared related to general study findings from last year. Will be posted on ACEE
  • Info also on their website:
  • Conclusions
  • Teachers were more concerned with performance as opposed to system details
  • Concerns still remain about implementation challenges such as comparability and fairness
  • SLO use increased for both individual and collective attribution
  • Districts still need more support and resources for implementing SLOs effectively
  • District staff reported engaging teachers as a key priority in ongoing implementation and improvement
  • Additional conclusions on the PPT
  • Resources made available:
  • Guiding questions for MSL Systems
  • 4 illustrative systems
  • Full report

  • Can participate in the study. Additional info about participation on the website.
  • Intent of research is for CEI to provide support to districts.

5) External Research in School Districts – Facilitated Discussion

  • Questions that help guide the discussion:
  • How do school districts handle the many data requests from outside organizations (universities, vendors, community members, etc.)?
  • How are school districts processing and reviewing outside research requests?
  • How have districts changed their approaches in light of recent student privacy concerns?
  • Do we have the same request happening across districts? What can we do to streamline those requests?
  • Strategies districts use:
  • MOUs with Universities related to IRB processes
  • Payment for data pulls for the time it takes ($50/hour was one example) for external people
  • Implementation of research windows
  • Use of applications and processes
  • CORA requests are for what is readily available ($30/hour after the first hour)
  • Find district sponsor
  • Is there a need to make sure legislators understand the current state of research and CORA requests and the potential misunderstandings and limitations related to potential legislation? What about a white paper related to workload, financial burden, and mutual benefit? (Paul M., Julie O., and Heather M.)
  • We will keep the paper short, but need to address it. Will send it out via Google Docs. Please read and provide feedback.
  • Research should be about mutual benefit. What about a research consortium that defines the scope and research for a group of districts and then partners with universities?
  • Need to share requests? Forum for sharing?

Next Meeting March 11, 2016

Denver Public Schools

*****No meeting in April*****