Curriculum Council

Wednesday, December 3, 2014

9:00 AM-11:30 AM Meeting

8:30 Morning Refreshments/11:30 Lunch

-Meeting Minutes-

  1. Welcome

Attendance: Kate Bacher, Glendale SD; Kristen Baughman-Gray, CIU #10; Tracy Boone, Bald Eagle SD; Jill Dillon, Harmony SD; Michelle Dutrow, West Branch ASD; Paul Hetrick, Curwensville ASD; Jacqulyn Martin, State College ASD; Bethann McCain, CIU #10; Bruce Nicolls, Clearfield ASD; Gregg Paladina, Philipsburg-Osceola ASD; Jamie Russler, CIU #10; Michelle Saylor, Bellefonte ASD; John Zesiger, Moshannon Valley SD

Virtual Attendance: Deirdre Bauer, State College Area SD

  2. CIU Updates
     a. Updates from Kristen
        • Pennsylvania Learns on iTunes U – Kristen provided an update. To access:

PDE is currently completing courses for middle school ELA, elementary math, and elementary science. The next round of course development will focus on elementary ELA; primary mathematics; Algebra II and geometry; and middle-level science (grades 5-8). To join the iTunes course development team, complete the application. The first round of development will occur January 21-23, 2015, with the second round scheduled for February 11-12, 2015.

If you are unable to join in course development, consider volunteering to assist us in vetting the courses. If you are interested, let Kristen know. Courses can be reviewed virtually and you can do as much or as little as time will permit.

        • Galaxy – Let Kristen know ASAP if you would like to schedule a residency.
        • Nearpod – The IU continues to offer consortium pricing: $50/teacher/year or $100/teacher/3 years. Interested districts may contact Kristen Gray for purchasing information. Training will be offered soon. Even if you are only using the free version of Nearpod, you can benefit from the training.
        • Math Design Collaborative for High School Teachers – The workshop is scheduled for February 18, 19, and March 20, 2015. Registration is open until February 4, 2015. No cost for the first 24 participants! Register soon!
        • STEM
           • TEAMS (Tests of Engineering Aptitude, Mathematics, & Science) – Registration Now Open

Registration for the 2015 TEAMS (Tests of Engineering Aptitude, Mathematics and Science) competition is now open. This year’s theme is “The Power of Engineering,” and the competition includes scenarios that explore the relationship between energy and engineering. Students have the opportunity to explore topics such as solar energy, wind and hydro power, nuclear power, alternative fuels, and smart homes through participation in the TEAMS competition. During this one-day competition, teams of four to eight students work together to solve issues related to these engineering topics and to compete for competition day, division, and state rankings. The TEAMS competition brings learning full-circle as teams apply classroom knowledge and research to real-world engineering problems. The 2015 TEAMS competition window is February 9 through March 21.

To learn more about the TEAMS competition and to register, visit

For more information on TEAMS, contact TEAMS program manager Sandy Honour via email or at 888-860-9010.

           • Engineering byDesign – Check out this video of EbD in action:
           • K'Nex Design Challenge – Registration for this event is open until December 19, 2014. Only the team leader will need to register. Cost is $40, which covers the cost of the kit. The challenge date is set for April 9, 2015. The tentative location is the CIU #10, unless more room is needed. If a location change is necessary, a notice will be e-mailed to team leaders.
     b. Updates from Bethann
        • Statewide Keystones to Literacy – Non-KtO districts are encouraged to take advantage of the opportunity to build a strong K-12 comprehensive literacy plan for their district.
        • Text-Dependent Analysis – Two-day session to be held at CIU #10 on January 14 & 21, 2015, covering:
           • Depth of Knowledge
           • Close Reading
           • School-Specific Item Development
        • Interest in an Administrator Version? Heather Spotts and Bethann McCain will gather information. If there is interest in this version, a training can be scheduled.
        • John Collins Argument Writing and Text-Dependent Analysis Questions – To be held at the Independent Order of Odd Fellows in State College on February 4, 2015. Registration is open until January 22, 2015.
        • First in Math – Free one-year access is available to districts that have never tried First in Math. To access, contact and copy . Note that the price becomes $6/child once students solve 1,000 problems.
        • Professional Development
           • CIU Professional Development Offerings – See the CIU website for current offerings.
           • Continuing Professional Education Courses – New courses have been posted.
           • SAS Institute – Dec. 7-9 – Registration is now open; there are no longer restrictions on the number of registrants per district.
           • Improving Schools Conference – January 25-28, Station Square, Pittsburgh
           • PDE Data Summit – March 22-25, Hershey Lodge

Target Audience: LEA administrators and LEA data teams, including PIMS administrators, child accounting coordinators, curriculum directors, special education data managers, special education directors, assessment coordinators, and technical directors, will all find workshops to enhance their skills and knowledge.

Strands:

  • Data Quality, Governance and Management
  • Systems Integration and Standards
  • Data Use (Analytical, Instructional, Fiscal decision-making)
  • Technical (How-to, hands-on sessions)
  • Special Education
  • PDE Collection Owners

Registration fee:

Early Bird (by 12/31): $250 (includes overnight), $200 (commuter)

Regular: $300 (includes overnight), $250 (commuter)

           • 2nd Annual Region 6 Leadership Institute at Toftrees – July 8-10, 2015 – Please put this event on your summer calendar. We hope you will join us. Plans are already underway for a great institute!
           • Advanced Placement Summer Institute – August 3-6, 2015 – A pamphlet outlining these courses is coming soon! Courses to be offered:
              • Spanish Language and Culture (Maria Vazquez-Mauricio, Instructor)
              • Statistics (Kenneth Pendleton, Instructor)
              • Physics 1: Algebra-Based (Patricia Zober, Instructor)
              • U.S. History (Ed Austin, Instructor)
              • 5th Course Added: Environmental Science (Tim Anderson, Instructor)
  3. PDE/PAIUCC Updates & Discussion
     a. Educator Effectiveness

For copyright permission to use the Framework for Leadership or Educational Specialist Rubrics within an electronic system, contact Deb Wynn. She will send you the necessary paperwork to secure permission.

     b. SPP

“A new feature of the SPP site is the ability to view trend analysis. Users can select the graph icon and see performance trends from Year 1 of the SPP to the current display. This feature is available for the building level academic score and for each performance measure.

An addition from last year is the ability of an International Baccalaureate (IB) school to earn extra credit for advanced achievement for 12th-grade students scoring 4 or higher on an IB exam taken over the course of their educational experience. This is in addition to extra credit for 12th-grade students scoring 3 or higher on at least one AP exam taken over the course of their educational experience.

The 2013-14 School Performance Profile scores have been calculated based upon 2013-14 assessment data, including PSSA, PASA, and Keystone Exam performance, as well as progress schools have made in closing the achievement gap in science. As Pennsylvania is transitioning to more rigorous standards and assessments in English Language Arts (ELA) and Mathematics, closing-the-achievement-gap data/results will not be reported until 2015-16 for ELA/Literature and Mathematics/Algebra I, as 2014-15 will serve as the baseline year for these subjects.” (Source: Media Kit Sample Newsletter Article)

Now that the writing assessment will be embedded in the ELA test, the ELA scores will be doubled in the formula for 2014-15 and beyond. The 2013-14 formula and the 2014-15 formula are both posted on the Curriculum Council site.

District-level SPPs will be released soon through the secure site only. However, if PDE gets a Right to Know Request, these will become public.

     c. FIRST

The Fidelity Implementation Review and Support Tool (FIRST) is now expected to be available January 1, 2015. The tool is designed to give teachers a system for monitoring whether their curriculum is being implemented with fidelity. It is a web-based system, currently being developed only for tested grades and subjects. All standards and eligible content are listed. Teachers go into the system daily or weekly to mark the standards and eligible content as follows:

  • Red – Not taught
  • Yellow – Introduced
  • Green – Reinforced
  • Light Blue – Students Understand
  • Dark Blue – Students Have Mastered

You can also download quizzes to formatively assess certain content. The tool is designed to be very simple to use.
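For readers who think in code, here is a minimal sketch of the five-color status scale described above; the class, field names, and sample standard code are illustrative assumptions, not part of the actual FIRST tool.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class FidelityStatus(Enum):
    """The five-color scale FIRST uses for each standard/eligible content."""
    RED = "Not taught"
    YELLOW = "Introduced"
    GREEN = "Reinforced"
    LIGHT_BLUE = "Students understand"
    DARK_BLUE = "Students have mastered"

@dataclass
class StandardEntry:
    """One teacher's daily/weekly mark for one standard (illustrative model,
    not FIRST's actual data schema)."""
    standard_code: str      # a standard or eligible content identifier
    status: FidelityStatus
    marked_on: date

# Example: marking a (hypothetical) standard as "Introduced"
entry = StandardEntry("ELA.STANDARD.EXAMPLE", FidelityStatus.YELLOW, date(2015, 1, 12))
print(f"{entry.standard_code}: {entry.status.value}")
```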

     d. PVAAS

PVAAS has developed a reflection guide related to teacher-specific reporting, encouraging folks to ask tough questions about the data and what it means for curriculum, instruction, and so on.

You have access to a demo account with sample teacher reports to use in training your teachers. See the access codes in the boxes that follow. Simply type the applicable user name and password into the login page on the PVAAS site.

Below are answers to some frequently asked questions (Source: PVAAS Team PAIUCC PowerPoint):

Q. Why isn’t there a report for a teacher who rostered students?

A. To actually be included in PVAAS Teacher Specific reporting, a student MUST:

  • Have a PSSA or Keystone score from the most recent year
  • NOT be a foreign exchange student
  • NOT be a first-year ELL student
  • NOT be Proficient or Advanced on a PRIOR Keystone exam
  • NOT be claimed at less than 10% instructional responsibility
  • NOT have tested with the PASA (alternate assessment)

In addition, the teacher must meet the Minimum N Counts (see the sketch after this list):

  • Must have at least 11 students
  • Must have an “Active N Count” of at least 6 students/6.0 FTE Students
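As a rough illustration of how the inclusion rules and N counts combine, here is a minimal sketch; the function and field names are assumptions for illustration, not the actual PVAAS implementation, and approximating FTE by summed instructional responsibility is likewise an assumption.

```python
from dataclasses import dataclass

@dataclass
class Student:
    """Illustrative student record; field names are assumptions, not PVAAS's schema."""
    has_current_pssa_or_keystone_score: bool
    is_foreign_exchange: bool
    is_first_year_ell: bool
    proficient_on_prior_keystone: bool   # Proficient or Advanced on a PRIOR Keystone
    instructional_responsibility: float  # 0.0-1.0 share claimed for this teacher
    took_pasa: bool                      # tested with the alternate assessment

def student_included(s: Student) -> bool:
    """Apply the six student inclusion rules listed above."""
    return (s.has_current_pssa_or_keystone_score
            and not s.is_foreign_exchange
            and not s.is_first_year_ell
            and not s.proficient_on_prior_keystone
            and s.instructional_responsibility >= 0.10
            and not s.took_pasa)

def teacher_has_report(students: list[Student]) -> bool:
    """Teacher must have at least 11 students and an 'Active N Count' of at
    least 6 students / 6.0 FTE. (Whether the 11 counts all rostered students
    or only included ones is not specified in the minutes; total is assumed
    here, and FTE is approximated by summed instructional responsibility.)"""
    included = [s for s in students if student_included(s)]
    active_fte = sum(s.instructional_responsibility for s in included)
    return len(students) >= 11 and len(included) >= 6 and active_fte >= 6.0
```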

Q. Why is there no Individual Student Growth Measure?

A. Growth is about a GROUP of students, NOT an individual student. The error for an individual student would be too large to know whether that student made growth or not.

Q. Why is there no predicted score now for SY 2014-15?

A. For assessments analyzed with the predictive model, each student’s predicted score is created as part of the value added analysis. A student’s predicted score is based on the student’s own testing history and on the average performance of students in the same cohort statewide who have a similar testing history. In other words, for each assessment we analyze with the predictive model, we look at the past testing histories of students across the state who took that assessment in the most recent year. We then determine the relationship of students’ testing histories to what the students actually scored. Then, for each student, we generate a predicted score based on how other students with similar testing histories actually scored, on average. As a result, the predicted score represents what each student needed to score to make progress that was typical for academically similar students statewide.
Because the predicted score is based on the performance of other students statewide who took the assessment in the same year, it’s not possible to generate the predicted score prior to the assessment.
Student projections are similar to predicted scores, but with one key difference in how they are generated. Like the predicted score, a student’s projection is based on the student’s own testing history and on how students with a similar testing history actually performed on the assessment. The difference is that the similar students are in a different cohort. In other words, these are students who have already taken the assessment we’re projecting to. The student for whom we’re providing the projection has not yet taken the assessment. Because we use the most recent cohort of testers, we can provide the projection before the student takes the assessment. For example, if we are generating projections to Algebra I, we will look at the progress of students who have just taken the Algebra I assessment. We will then project that students who have not yet taken the Algebra I test will make progress similar to that of students with similar testing histories who have already taken the Algebra I assessment.
The projection indicates how a student is likely to score if the student makes progress that was typical for students with a similar testing history who took the assessment in the most recent year. While the projection is a reliable measure of a student’s entering achievement level, we don’t recommend that you compare the projection to the student’s actual score on the assessment. This kind of simple comparison does not account for measurement error, so it does not yield a reliable measure of the student’s growth.

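As a toy illustration of the “similar testing histories” idea, here is a sketch that predicts a score as the average outcome of the statewide peers whose prior scores look most alike. This is emphatically not the actual PVAAS/EVAAS statistical model, which is a far more sophisticated regression-based analysis; every name and number below is an assumption for illustration only.

```python
# Toy illustration of the "similar testing histories" idea described above.
# NOT the actual PVAAS/EVAAS model; it only conveys the intuition.
from statistics import mean

def predict_score(student_history: list[float],
                  cohort: list[tuple[list[float], float]],
                  k: int = 3) -> float:
    """Average the actual scores of the k cohort students whose prior
    testing histories are closest to this student's (illustrative only)."""
    def distance(history: list[float]) -> float:
        return sum((a - b) ** 2 for a, b in zip(history, student_history))
    nearest = sorted(cohort, key=lambda pair: distance(pair[0]))[:k]
    return mean(score for _, score in nearest)

# Cohort entries: (prior-score history, actual score on the assessment)
cohort = [([1200.0, 1250.0], 1300.0),
          ([1190.0, 1240.0], 1280.0),
          ([1500.0, 1520.0], 1550.0),
          ([1210.0, 1260.0], 1310.0)]
print(predict_score([1205.0, 1255.0], cohort))  # ~1296.7, from the 3 closest peers
```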

Q. Why is there a difference in my School Value-Added vs. Teacher Value-Added Reporting?

A. PVAAS School Reporting uses full academic year as a requirement for including students.

PVAAS Teacher Value-Added Reporting uses % Student + Teacher Enrollment and % Shared Instruction to determine the weighting of each student in each teacher’s PVAAS Teacher Specific Reporting.

The standard error around the growth measure will be larger for a teacher than for the teacher’s respective school, given the number of students and prior test scores available to use in the analyses (i.e., the smaller the number of students, the larger the standard error).

Because individual classrooms have fewer students than the overall school, PVAAS Teacher Specific Reporting may yield different colors than PVAAS School Reporting. For example, if a school is on the cusp of red, its teachers might fall only into yellow or green.

The analytic model that generates the teacher reports is more conservative than the model that generates the school reports because it includes additional protections for teachers. These protections help to ensure that teacher reports generated from relatively smaller numbers of students do not yield measures that are unreasonably high or unreasonably low. This protects teachers from misclassification due to small amounts of data.

Q. What is the Composite score?

A. It is the combined growth measure across a teacher’s PVAAS-reported subjects/grades/courses. For SY13-14 we only have one year of reporting, so the Composite Score is a combined growth measure across a teacher’s PVAAS-reported subjects/grades/courses for SY13-14 only. In future years, the Composite Score will represent a combined growth measure across a teacher’s PVAAS-reported subjects/grades/courses across years, up to and including 3 consecutive school years:

  • 1-Year Composite
  • 2-Year Composite
  • 3-Year Composite / 3-Year Rolling Average

Q. How is the Composite Score calculated?

A. To calculate the composite, a simple average is taken of all of the teacher's individual index values for up to three years. The average is then multiplied by the square root of the number of individual index values that went into it. This step is necessary because it accounts for the fact that more data was used to generate the average than was used to generate each individual index, which affects the standard error for the composite.
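A worked sketch of that calculation, using the index values from the combination examples later in these minutes:

```python
from math import sqrt

def composite(index_values: list[float]) -> float:
    """Simple average of the individual index values, multiplied by the
    square root of how many values went into the average."""
    average = sum(index_values) / len(index_values)
    return average * sqrt(len(index_values))

# The two combination examples given later in these minutes:
print(round(composite([1.9, 1.1]), 2))      #  2.12 -> dark blue
print(round(composite([-1.42, -1.81]), 2))  # -2.28 -> red
```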

Q. Why is there a difference between my Composite and my Teacher Value-Added?

A. It is all about the amount of evidence in the student assessment results: the assumption is that the achievement level of the teacher's group of students is maintained (green) UNLESS there is enough evidence in the assessment data to say otherwise.

The more data available, the more evidence we have to see whether the group of students exceeded the growth standard or not.

There is more evidence when all the data are combined for a composite score. With data from multiple subjects (or multiple years, when that’s available) included in the Composite, there is more evidence/more data.

An analogy: we can estimate how many blue M&M’s are in bags of M&M’s by looking at one bag, BUT the error would be a little larger than if we looked at more bags. We will have a better estimate/more evidence/less error on our measure if we look at 100 bags of M&M's.
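A quick simulation makes the analogy concrete: the spread (standard error) of our estimates shrinks as we sample more bags. The blue share and bag size below are made-up numbers for illustration.

```python
# Estimate the share of blue M&M's from 1 bag vs. 10 vs. 100 bags.
# More bags -> less spread (smaller standard error) in the estimates.
import random
from statistics import pstdev

TRUE_BLUE_SHARE = 0.24  # assumed true proportion of blue candies
BAG_SIZE = 50           # assumed candies per bag

def estimate(num_bags: int) -> float:
    """Estimate the blue share by sampling num_bags bags."""
    candies = num_bags * BAG_SIZE
    blues = sum(random.random() < TRUE_BLUE_SHARE for _ in range(candies))
    return blues / candies

for bags in (1, 10, 100):
    estimates = [estimate(bags) for _ in range(1000)]
    print(f"{bags:>3} bag(s): spread of estimates = {pstdev(estimates):.4f}")
```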

Remember, for PVAAS the assumption is that the achievement level of the teacher's group of students is maintained (green) UNLESS there is enough evidence in the assessment data to say otherwise.

If we have 2 different indicators that separately indicate moderate evidence of exceeding the standard, would you agree that when we combine those two indicators we now have even stronger evidence of exceeding the standard?

So, it is not surprising that when we combine 2 separate indicators of moderate evidence of exceeding the standard (i.e., +1.9 and +1.1, both light blue), we now have significant evidence of exceeding the standard instead of just moderate evidence (combining +1.9 and +1.1 gives you +2.12, or dark blue).

In the same way, if we have 2 different indicators that separately indicate moderate evidence of falling short of the standard, would you agree that when we combine those two indicators we then have even stronger evidence of falling short of the standard?

So, it is not surprising that when we combine 2 separate indicators of moderate evidence of falling short of the standard (i.e., -1.42 and -1.81, yellow and yellow), we now have significant evidence of falling short of the standard instead of just moderate evidence (combining -1.42 and -1.81 gives you -2.28, or red).

The composite is not really an “average,” but rather an accumulation of the evidence/data towards meeting, exceeding, or falling short of the standard for PA Academic Growth.

Q. Why is there a difference between my Teacher Value Added vs. my Diagnostic Reporting?

A. Keep in mind that on the Value-Added reports, students are weighted in the analysis based upon the total % of instructional responsibility for that student. For Diagnostic reports, however, students are weighted equally. The Diagnostic reports reflect the growth of a group of students who may have had more than one teacher with instructional responsibility in that specific subject, grade, or course. They are to be used for diagnostic purposes only, not evaluative purposes.
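To make the weighting difference concrete, here is a minimal sketch contrasting the two schemes; the growth values and responsibility percentages are made up for illustration.

```python
# Value-Added reports weight each student's growth by the teacher's % of
# instructional responsibility; Diagnostic reports weight students equally.
# All numbers below are made up for illustration.
students = [
    # (growth measure, teacher's instructional responsibility for the student)
    (3.0, 1.00),
    (-1.0, 0.50),
    (2.0, 0.25),
]

# Value-Added style: weighted by instructional responsibility
weighted = sum(g * w for g, w in students) / sum(w for _, w in students)

# Diagnostic style: every student counts equally
unweighted = sum(g for g, _ in students) / len(students)

print(f"weighted (Value-Added style):  {weighted:.2f}")    # 1.71
print(f"unweighted (Diagnostic style): {unweighted:.2f}")  # 1.33
```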