The Pennsylvania Department of Education

Proposal to the US Department of Education for Participation in the

No Child Left Behind (NCLB) Growth Model Pilot Program

October 15, 2008

Submitted by:

Gerald L. Zahorchak, D.Ed.

Secretary of Education

Commonwealth of Pennsylvania

Table of Contents

Executive Summary

Table 1: 2007 and 2008 PSSA Subgroup Data Analysis

Using Growth Model (PVAAS Projections) as an Additional Way to Make AYP

Table 2: Distribution of Targets

Table 3: Frequency of Targets

C1 – Abstract

Table 4: Present and Projected Grades

C2 – State’s Capacity regarding its Data, Assessment and Accountability System

Table 5: History of State Assessment

State Data Infrastructure

Experience in analyzing longitudinal data on student performance

Match Rates and Analyses

Table 6: School Year 2008 Merge Rates: Overall for Grades 4-8

Table 7: School Year 2008 Merge Rates: By Grade, Grades 4-8

Table 8: Merge Rates that Improved from 2007 to 2008 by at Least 1%

Current Accountability System

Table 9: AYP Targets

Student Participation Rates

Table 10: 2008 AYP Statewide Participation Rates

Growth-based Accountability

Assessment Quality, Data Systems and Growth

PSSA Test Validity and Reliability

PSSA Calibration

PSSA Content Validity

PSSA Reliability

PSSA Reliability Indices

PSSA Reliability of Performance Levels

PSSA Validity

PSSA Construct Validity

Alternate Assessments

C3 – Model Description

Pennsylvania Growth Model Amendment Scheme

Validity Expectations

Table 11: Relationship Between Projected Scores and Later Observed Scores

# of Schools that Would Make AYP with Growth Model

Executive Summary

Pennsylvania’s proposed expanded accountability system contains multiple ways for schools and districts to demonstrate Adequate Yearly Progress (AYP). This model recognizes that not all schools are alike and that there are multiple ways for schools not only to demonstrate their success in reaching proficiency but also to demonstrate their progress toward bringing all students to proficiency. Pennsylvania schools are starting in different places, yet Pennsylvania’s expectation is that all students (100%) will reach proficiency or beyond by 2014. There is therefore a need to offer multiple ways for students, schools and districts to demonstrate that they are working toward the goal of 100% proficiency by 2014 (see Appendix A).

The multiple methods used in Pennsylvania’s accountability plan can be categorized as status measures, improvement measures, and growth measures. These three groups of measures highlight positive aspects of achievement and growth in different ways.

Status measures (including status with and without a confidence interval, and two-year averaging with and without a confidence interval) recognize schools and districts that have reached or exceeded the performance targets. These measures are not sensitive to the growth of students below proficient, nor do they take into account a decline in achievement levels, as long as a sufficient percentage of students score proficient. Provided that schools and districts reach the target with these status measures, they are not, nor should they be, penalized for not moving all students toward proficiency.
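
As an illustration of a status check with a confidence interval, the sketch below uses the common normal-approximation form. The exact interval Pennsylvania applies is defined in its Accountability Workbook; the function name and the 1.96 default here are illustrative assumptions, not the Workbook's specification.

    import math

    def makes_status_ayp(n_proficient: int, n_tested: int, amo_pct: float,
                         z: float = 1.96) -> bool:
        """Status with a confidence interval: the group makes AYP if the
        AMO target falls at or below the upper bound of a confidence
        interval around the observed percent proficient."""
        p = n_proficient / n_tested
        half_width = z * math.sqrt(p * (1.0 - p) / n_tested)
        return 100.0 * (p + half_width) >= amo_pct

    # Example: 52 of 100 students proficient against a 56% AMO.
    # Upper bound is roughly 52% + 9.8 points = 61.8%, so AYP is made.
    print(makes_status_ayp(52, 100, 56.0))  # True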

The improvement measures recognize schools and districts that may not have reached the performance targets via status measures but are showing some movement of students either across or toward the proficiency line. While NCLB allows the safe harbor method as a means of making AYP, this particular improvement measure does not hold to the core principle of 100% proficient by 2014: schools and districts with small numbers of proficient students in 2002 can make AYP every year and still not achieve 100% proficiency by 2014. The safe harbor provision of NCLB nonetheless has two advantages: (1) it acknowledges improvement in groups of students (although different groups of students) from one year to the next, and (2) it allows for different improvement targets for each measurable subgroup, thus taking into account a different starting point for each group.
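
To make the arithmetic concrete, the sketch below applies the core safe harbor rule, a 10% relative reduction in the non-proficient share from one year to the next, and shows why a school that exactly meets that rule every year still falls short of 100% by 2014. The function name is illustrative, and this sketch omits safe harbor's secondary conditions (such as the other academic indicator).

    def safe_harbor_met(pct_prof_prev: float, pct_prof_curr: float) -> bool:
        """Core NCLB safe harbor test: the share of non-proficient students
        must fall by at least 10% (relative) from the prior year."""
        return (100.0 - pct_prof_curr) <= 0.90 * (100.0 - pct_prof_prev)

    # A school at 10% proficient in 2002 that meets exactly the minimal
    # safe harbor improvement every year through 2014:
    pct_prof = 10.0
    for year in range(2003, 2015):
        pct_prof = 100.0 - 0.90 * (100.0 - pct_prof)
    print(round(pct_prof, 1))  # ~74.6% proficient in 2014, well short of 100%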

Pennsylvania’s Performance Index (PPI), which was approved by USDOE in 2005, is another improvement measure. PPI is similar to safe harbor in that improvement toward proficiency by different groups of students is rewarded. PPI also sets different improvement targets for each measurable subgroup, again taking into account the reality of a different starting point for each group. PPI differs from safe harbor in two ways. First, PPI holds all schools, districts and subgroups to 100% proficiency by 2014. (Appendix B contains a complete explanation of how PPI is calculated and how targets are set.) Second, PPI acknowledges improvement in groups of students below proficiency without requiring a specific degree of improvement in the percent of students reaching proficiency. This allows all schools and districts to utilize PPI from their unique baseline of student performance.

Analysis of 2007 and 2008 Pennsylvania System of School Assessment (PSSA) data indicates that different subgroups are able to make AYP utilizing these existing methods (see Table 1).

This analysis indicates that while a large percentage of students are reaching the Annual Measurable Objective (AMO) targets, some distinct groups benefit from the use of the improvement targets – namely safe harbor and PPI. The IEP subgroup and the economically disadvantaged subgroup in particular benefit from these improvement measures. This analysis shows the importance of permitting multiple methods of making AYP.

Table 1: 2007 and 2008 PSSA Subgroup Data Analysis

Because NCLB’s increased accountability recognizes the importance of holding all students to high standards, schools with diverse student populations must meet more targets than non-diverse schools. Pennsylvania has a wide range of demographics; as a result, we have schools with as few as five targets: attendance or graduation; participation in reading; participation in math; performance in reading; and performance in math. We also have schools with as many as 41 targets: attendance or graduation, plus participation in reading, participation in math, performance in reading, and performance in math, each measured for all students and for each of the nine possible subgroups (1 + 4 × 10 = 41). With 41 possible targets for a school and 122 possible targets for a district (see Appendix C for a list of possible targets at the district and school level), Pennsylvania will continue to urge USDE to allow maximum flexibility to states in meeting those multiple targets.

Pennsylvania is proposing an eighth method for schools to demonstrate AYP: the projection to proficiency metric. Analysis of the 2008 data reveals that this method is, as expected, proving extremely beneficial to the economically disadvantaged subgroup.

Using Growth Model (PVAAS Projections) as an Additional Way to Make AYP

Based on 2008 PVAAS projection to proficiency data, 242 additional schools would have made AYP. Some of these schools had missed AYP because of only one (1) target, while others had missed by several targets. Data in Table 2 show, for each school, the number of targets whose status the growth model overturned. For example, 150 schools made AYP because a single target was met when the growth model was applied as an additional way to make AYP. The other 92 schools each had two (2) or more targets that were met due to the growth model.

Table 2: Distribution of Targets

Targets Overturned    Number of Schools    Percentage of Schools
One (1)               150                  62.0%
Two (2)               45                   18.6%
Three (3)             30                   12.4%
Four (4)              9                    3.7%
Five (5)              1                    0.4%
Six (6)               7                    2.9%
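
As a quick check, the percentage column in Table 2 follows from dividing each count by the 242 additional schools; the sketch below reproduces it, assuming rounding to one decimal place.

    # Reproduce Table 2's percentage column from the counts.
    counts = {"One (1)": 150, "Two (2)": 45, "Three (3)": 30,
              "Four (4)": 9, "Five (5)": 1, "Six (6)": 7}
    total = sum(counts.values())
    assert total == 242  # matches the 242 additional schools cited above
    for label, n in counts.items():
        print(f"{label}: {n} schools, {100 * n / total:.1f}%")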

Based only on the 242 additional schools that would have made AYP overall with the use of the growth model, the frequency of targets that were not originally met using the existing seven (7) provisions but would have been overturned by the growth model was investigated. Table 3 displays the frequency of the targets that were overturned, based only on schools in which all of their unmet AYP targets were overturned; this ranged from one (1) to six (6) targets per school. (For example, a school that had one target overturned but another target that was not overturned by the growth model was not included in this analysis.)

Table 3: Frequency of Targets

Subgroup                          Math    Reading
All Students                      12      57
White                             1       15
Black                             15      51
Latino/Hispanic                   3       14
Asian/Pacific Islander            0       0
American Indian/Alaskan Native    0       0
Multiracial/ethnic                0       0
IEP                               27      46
LEP                               0       0
Economically Disadvantaged        22      150


Proposal

C1 – Abstract

Pennsylvania proposes to apply an individual student “projection to proficiency” metric as the growth model for schools and districts to meet AYP, in addition to the methods presently defined and accepted in the USDOE-approved Pennsylvania Accountability Workbook. This proposal requests the inclusion of a projection to proficiency (growth) metric – NOT a value-added metric – to yield a longitudinal analysis of student achievement data for the determination of AYP status. This will recognize schools in which students have not yet achieved proficiency but are demonstrating significant growth toward proficiency in a time frame aligned to Pennsylvania’s AMO targets.

Under the proposed accountability system, districts, schools and subgroups will have three options for meeting AYP proficiency targets in reading and math: (1) status, (2) improvement, or (3) growth, i.e., projection to proficiency. Pennsylvania proposes to use the projection to proficiency metric of PVAAS to estimate the score of a particular student on a future state assessment, which will then be used as part of the AYP determination. Using all available achievement data on each student, the projection calculation estimates the student’s performance on a future assessment based on the student’s test performance history, the histories of students with similar performance patterns, and the school the student is most likely to attend in that projected grade. The individual student projection data will be used to recalculate the percent of students, by district, school, subgroup and subject area, who are projected to attain proficiency on a future Pennsylvania System of School Assessment (PSSA) in the subject and targeted group where AYP was not made via status or improvement measures. The sequence of projections is specified in Table 4:

Table 4: Present and Projected Grades

Present Grade    Projected to Proficiency in Grade
3                Use Actual Grade 3 Score
4                Project to 6
5                Project to 7
6                Project to 8
7                Project to 8
8                Project to 11
11               Use Actual Grade 11 Score

The grades chosen for the projection to proficiency are based on the varied school configurations presently utilized in Pennsylvania. The current-year/actual scores of 3rd grade students and of students new to the State will be used in the projection to proficiency model, as these students do not have adequate data to yield a projection. The State will likewise use current-year/actual scores of Grade 11 students, and of students with significant cognitive disabilities assessed under the Pennsylvania Alternate System of Assessment (PASA). In short, for any student who does not have adequate longitudinal data to yield a projection, the actual/observed score will be used. The data used in the growth model therefore include all of the students and all of the statewide assessments administered in Pennsylvania. A minimal sketch of this counting rule appears below.
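
The sketch below shows how the Table 4 mapping and the actual-score fallback might combine when recomputing a group's percent proficient. All names here (PROJECTION_TARGET, Student, percent_proficient_with_growth) are illustrative assumptions, not PDE's or PVAAS's actual data model, and the PVAAS projection itself is treated as a given input rather than computed.

    from dataclasses import dataclass
    from typing import Optional

    # Table 4 as a lookup: the future grade each present grade's projection
    # targets; None means the actual score is used (grades 3 and 11).
    PROJECTION_TARGET = {3: None, 4: 6, 5: 7, 6: 8, 7: 8, 8: 11, 11: None}

    @dataclass
    class Student:
        grade: int
        observed_proficient: bool  # this year's PSSA (or PASA) result
        projected_proficient: Optional[bool] = None  # PVAAS projection;
                                                     # None if no history

    def counts_as_proficient(s: Student) -> bool:
        """Use the projection when one exists; otherwise fall back to the
        actual score (grade 3, grade 11, PASA, students new to the state)."""
        if PROJECTION_TARGET.get(s.grade) is None or s.projected_proficient is None:
            return s.observed_proficient
        return s.projected_proficient

    def percent_proficient_with_growth(students: list[Student]) -> float:
        return 100.0 * sum(map(counts_as_proficient, students)) / len(students)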

C2 – State’s Capacity regarding its Data, Assessment and Accountability System

The foundation of any growth-based accountability model is longitudinal assessment data. Pennsylvania has collected sufficient state assessment data to calculate projections to proficiency for all of the grades listed in Table 4, and has a complete and robust longitudinal database of assessment data for this analysis. Table 5 displays the history of collection of state assessment data in reading and mathematics in Pennsylvania since academic year 2003-04:

Table 5: History of State Assessment

Year       3rd    4th    5th    6th    7th    8th    11th
2003-04     X             X                    X      X
2004-05     X             X                    X      X
2005-06     X      X      X      X      X      X      X
2006-07     X      X      X      X      X      X      X
2007-08     X      X      X      X      X      X      X
2008-09     X      X      X      X      X      X      X

State Data Infrastructure

In November 2005 the Pennsylvania Department of Education (PDE) was selected as one of 13 states to receive the first round of statewide longitudinal data system (SLDS) grant awards. Pennsylvania’s journey to build an SLDS started with the assignment of unique student and staff identifiers in FY 2007: PAsecureIDs (based on eScholar’s Uniq-ID for Students) were assigned to all 1.8 million public school students, and Professional Personnel Identifiers (PPIDs) were assigned to all teachers and certificated staff. In that same year, the PAsecureIDs were provided to PDE’s assessment vendor to begin capturing longitudinal assessment data for the 2006-07 school year.

These initiatives were necessary prerequisites to the implementation of the Pennsylvania Information Management System (PIMS), PDE’s data collection, data warehouse, and reporting system.

The SLDS funds, along with state appropriations, were used to initiate the implementation of PIMS. PIMS consists of the eScholar Data Manager, the eScholar Complete Data Warehouse, and the Cognos Business Intelligence Suite. The PIMS solution also includes the ability to assign PAsecureIDs and report data to PIMS using the Schools Interoperability Framework (SIF). SIF is being used by a growing number of Pennsylvania LEAs to integrate their data, both horizontally across the LEA and vertically in reporting to the state.

Statewide implementation has resulted in the following accomplishments that ensure a quality data system in Pennsylvania:

  • Established SLDS Vision for a Birth to 20+ SLDS
  • Established a strong PIMS Governance Board – senior executive leadership of PDE
  • Established a strong, active pilot group of 15 LEAs, including our biggest, Philadelphia SD
  • Completed successful pilot of Schools Interoperability Framework (SIF) technology
  • Contracted for a robust PIMS Level 1 Help Desk at a PA Regional Educational Agency
  • Implemented a PIMS Level 2 Help Desk supported by PDE data stewards
  • Trained over 1,200 LEA staff on data submission and reporting
  • Generated “precode” file to produce 900,000 labels for statewide assessment exams
  • Provided over 200 reports to LEA and PDE administrators and policy makers
  • Replaced 8 existing PDE data collections and supporting systems
  • Collected information on 2006-07 first time 9th graders (Graduation Rate)
  • Implemented PAsecureID in the Pennsylvania State System of Higher Education
    (PASSHE – 14 state universities) and all Community Colleges.
  • Collected data on Student Attendance, School Calendars, Career and Technical Education

With the elimination of eight (8) legacy data collections (and with several more planned system retirements), PIMS has become the single version of truth for Pennsylvania’s education data. PIMS is also capturing the data needed to calculate graduation rates accurately in accordance with the formula established by the National Governors Association. PDE has redirected many internal resources and dedicated them to our SLDS project to support the new enterprise data environment.
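
For reference, the NGA Compact’s four-year cohort formula divides on-time graduates by the entering 9th-grade cohort adjusted for transfers; the sketch below shows that arithmetic. The function and parameter names are illustrative, and the official calculation includes further cohort adjustments beyond this sketch.

    def nga_graduation_rate(on_time_graduates: int,
                            first_time_9th_graders: int,
                            transfers_in: int,
                            transfers_out: int) -> float:
        """Four-year cohort rate per the NGA Compact: on-time graduates
        divided by the entering cohort adjusted for student transfers."""
        adjusted_cohort = first_time_9th_graders + transfers_in - transfers_out
        return 100.0 * on_time_graduates / adjusted_cohort

    # Example: 850 on-time graduates from 1,000 first-time 9th graders,
    # with 120 transfers in and 100 transfers out -> ~83.3%.
    print(round(nga_graduation_rate(850, 1000, 120, 100), 1))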

The SLDS team has been working in various capacities to ensure the successful implementation of PIMS: writing the initial grant application; developing the RFP; evaluating and selecting the vendor; providing project management; meeting with other states to ascertain best practices; reporting progress to the USDE; leading training webinars for LEAs and their vendors; delivering presentations to stakeholder associations across the state; bringing together different program offices to resolve issues; keeping the PIMS Steering Committee informed and involved; assessing the quality of reported data and helping LEAs correct inaccurate data; assisting LEAs with PAsecureID assignments and PIMS file uploads; supporting and communicating with the PIMS administrators in over 800 LEAs; gathering reporting requirements from program offices; and maintaining security – in short, working long hours to make quality data in Pennsylvania a reality.

PIMS has had a major impact on the enterprise. As program offices begin to share the same source of data, they have worked together to create common data definitions and business rules. PDE and LEA staff are learning to use the Cognos tools, allowing more robust data analysis and reporting than was possible with the prior aggregate data collections. As the set of longitudinal data grows each year in the warehouse, the return on investment of state initiatives can be assessed more accurately and funds directed to the most effective programs for students. PIMS has also had a significant impact at the LEA level. The collection of individual student data is causing LEAs to improve the quality of the data in their student information systems. Through Cognos reporting, LEAs have easier access to the data they are submitting to PDE.

PDE has implemented many quality assurance strategies to maintain accuracy of the student matching system. These include:

  • PDE has a full-time team dedicated to data quality for 1.8 million student PAsecureIDs.
  • PDE has also engaged a Regional Education Agency as a Level 1 Help Desk for LEAs assigning PAsecureIDs and reporting into our SLDS (PIMS).
  • PDE has developed queries designed to find both multiple PAsecureIDs assigned to one student and multiple students assigned to one PAsecureID. These queries have been implemented in a database that is linked to the PAsecureID database. The application generates discrepancy reports that are then reviewed in PDE's Division of Data Services, the program area responsible for PAsecureID, and necessary corrections are made. This is an ongoing process, with the discrepancy reports run and reviewed on a regular basis. (A sketch of these two checks appears after this list.) These dedicated resources and data quality reports keep Pennsylvania’s PAsecureID error rate well below 0.1%.
  • The PAsecureID initiative started in December 2005. PDE’s first priority was to integrate the PAsecureID into our state assessment vendor’s database for the spring 2006 state assessments, thereby starting longitudinal records for all 900,000 students taking the state assessments in the 2005-06 school year. PVAAS, Special Education and many other student databases have been integrated with Pennsylvania’s PAsecureID as the unique student identifier.
  • PDE has instituted the Pennsylvania Inspired Leaders (PIL) initiative, under which Chief School Administrators and their assistants must take specified professional development courses targeted at school leadership in order to keep their certification. PDE is working to add the recently released NCES online course on Data Quality to that curriculum. This NCES Forum Curriculum for Improving Educational Data is a resource for LEAs; it was developed in cooperation with the Schools Interoperability Framework Association, the Council of Chief State School Officers and a Pennsylvania Regional Education Agency. The courses can be found at ( )
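
The sketch below illustrates the two discrepancy checks described above, one student holding multiple PAsecureIDs and one PAsecureID shared by multiple students. The real reports run as queries against the PAsecureID database; the function and field names here are assumptions for illustration.

    from collections import defaultdict

    def id_discrepancies(records):
        """records: iterable of (pasecure_id, student_key) pairs, where
        student_key is a demographic match key (e.g., name + date of birth).
        Returns students holding multiple IDs and IDs shared by multiple
        students, mirroring the two discrepancy reports."""
        ids_per_student = defaultdict(set)
        students_per_id = defaultdict(set)
        for pasecure_id, student_key in records:
            ids_per_student[student_key].add(pasecure_id)
            students_per_id[pasecure_id].add(student_key)
        multi_id_students = {k: v for k, v in ids_per_student.items() if len(v) > 1}
        shared_ids = {k: v for k, v in students_per_id.items() if len(v) > 1}
        return multi_id_students, shared_ids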

With such a foundation of success, Pennsylvania is uniquely positioned to engage in expansion activities with the state data system over the next five years, and is specifically well established with the necessary foundation of quality data for a growth model as part of its accountability system.