Roosevelt High School
2003 Assessment Workbook
August 2003
This workbook was prepared by the staff of the Research, Evaluation & Assessment Department with support from Management Information Systems. While numerous staff contributed to this workbook, special thanks go to Karin Fallon, Jenny Miller, June Norton, Joe Suggs, Dori Torrence, and Therese White of PPS and to summer interns Elena Barkin, Zohra Chandiwalla, Rithy Chhit, and Jessica Southward. Questions or comments about this workbook should be directed to Dr. Sue Hiscox, 503/916-2000, extension 4286.
Copyright 2003, Portland Public Schools
Assessment Workbook 2003
Table of Contents
How to Use This Workbook i
Checking Assumptions ii
School Context iii
Assessment Issues
Issue 1
How did our school do against the assessment benchmarks? 1
Graph 1: 2003 School “Meets” and “Exceeds” Percentages
Issue 2
Has our school’s performance improved over time? 2
Graph 2: Total of “Meets” and “Exceeds” Percentages Over Time by Grade
Graph 3: RIT Scores Over Time by Grade
Issue 3
Are our school’s demographic groups performing equally? 4
Graph 4: Total of “Meets” and “Exceeds” Percentages by Ethnic Group in 2003
Graph 5: Total of “Meets” and “Exceeds” Percentages by Gender in 2003
Issue 4
How do special populations compare to school performance? 6
Graph 6: Total of “Meets” and “Exceeds” Percentages by Special Populations in 2003
Issue 5
How successful are long-term students? 7
Graph 7: Total of “Meets” and “Exceeds” Percentages by Group in 2003
Issue 6
Are our students at all performance levels showing growth? 8
Graph 8: Change in Average RIT Scores by 2002 Performance Level
Issue 7
What are our school’s weakest and strongest content areas? 9
Graph 9: Percent Meeting Each Goal for Reading and Math Tests
Issue 8
Did our school meet adequate yearly progress (AYP) standards? 10
Table 10: Overall AYP
Appendix: Supporting Results
Table: Three-year reading and math results for all students 11
Table: Three-year reading and math results by ethnic group 12
Table: Three-year reading and math for long-term students 14
Table: Average RIT gains by spring 2002 RIT status 15
Table: AYP Summary 16
How to Use This Workbook
This workbook is intended for use by school staff and site councils as they analyze their assessment results for the prior year and develop a school improvement plan for the next year. The enclosed summary information addresses eight key questions that schools should ask about the results of their programs on all students. Through the workbook, users can identify key areas of strength or concern and identify additional resources for more in-depth information.
Assessment information from standardized achievement tests is only one source of data for school improvement plans. The workbook should be used jointly with other school-based data (such as grades, local assessments, and surveys) to develop and refine school improvement plans.
Reviewing data is particularly useful if you first consider your own assumptions about how students perform. The Checking Assumptions page that follows allows you to do this. Then move to the assessment data in the rest of this workbook to see where it confirms your assumptions or surprises you, as one step in understanding your school and where it can improve.
You will find the following sections on each of the subsequent pages:
An Issue
Assessment results in isolation may be interesting, but they are most useful when used to answer questions about student performance. The eight questions in this workbook focus attention on the performance both of students overall and of specific subgroups of students. These are the key questions that should inform a school’s improvement plan.
What to Look For
This section gives specific guidance on how to use the graph(s) and Supporting Results at the back of the workbook and on the PPS Research and Evaluation (R&E) website to identify strengths and areas for improvement.
The section is followed by a graph that provides an overview answer to the question. A quick look at this section should answer the basic question for the page. (Note: In some cases, very few students are in a specific category. Small groups are strongly affected by the scores of even one individual and are not useful for reviewing trends. If you have fewer than ten students in a category, you will not be given results for that group.)
Key Findings and Follow Up Questions
In this section, users should note information that surprised them, apparent strengths of the school, and likely areas for improvement based on the graph(s) and supporting results in the workbook or on the web. They should also note how the data in this section relate to other information from grades, in-school tests, surveys, etc.
Supporting Results
The graph in the “What to Look For” section often generates immediate questions. The Supporting Results section at the end of this document contains more in-depth tables that will often provide the answer. In addition, the R&E web site (http://inside.pps.k12.or.us/depts/rne) contains tables that provide considerable grade-level, subject-specific, and group-level information.
Checking Assumptions
Take five minutes to note your assumptions about the following questions. As you look through the workbook, review where results differed from your assumptions.
- In general, would you expect your school to be below or above the district average scores?
- What percent of your students would you expect to meet benchmarks?
Exceed benchmarks?
Show as low/very low?
- How well is your curriculum aligned to the state standards?
Grade 3 Grade 5 Grade 8 Grade 10
Reading
Math
Writing
Science
- Over the past two years, what changes have you made in your curriculum or teaching methods for:
Reading
Math
Writing
Science
- How would you expect students who have been in your school for 2-3 years to compare with students who have been here less time?
- Is there a grade (e.g., this year’s 5th or 7th graders) that has been outstanding or tough compared to other years?
- How would you expect the following subgroups of students to compare to your school average? Higher? Lower? Same?
–ii–
English Language Learners
Free/Reduced Price Meals
Special Education
Talented and Gifted
African American
American Indian
Asian American
European American
Hispanic American
Roosevelt High School
Issue 1:
How did our school do against the assessment benchmarks?
What to Look For
The graph below shows the percent of students who meet (solid box) and exceed (striped box) benchmarks. If one grade differs from the others, remember that this can be due to a number of factors, including alignment with standards or the characteristics of that particular grade’s group of students. It would take more research to determine why a grade may be unusually low or high.
Page 11 in Supporting Results will answer specific questions about your test results compared to the district and show you the percent of students in each performance category. Look at these results as you consider the following questions: Are you satisfied with the percent of students exceeding? Are students moving from “Meets” to “Exceeds” each year? Is the percent of students who are “Low” or “Very Low” acceptable?
Key Findings and Follow Up Questions (Which results surprised you? Which results are you pleased with? What do you need to improve? What questions do you want answered? Be sure to check the R&E web site (http://inside.pps.k12.or.us/depts/rne) for more in-depth information on this issue. Who will get those answers, and how?)
Issue 2:
Has our school’s performance improved over time?
What to Look For
The chart below shows the percent of students meeting standards over the past four years, by grade. Remember that in 2003 the students included in this analysis changed to comply with federal No Child Left Behind legislation. Beginning then, students who took modified or extended assessments, CLRAS, or challenged down to a lower grade’s test counted as “Not Meeting.”
What grades have shown a history of trending upward? How do these data fit with your knowledge about the student groups each year? If you see a major improvement or decline in a particular year, did something specific happen to explain it?
Look at page 11 in Supporting Results to consider what would happen to your percent of students meeting benchmark if “Nearly Meets” students moved to “Meets” next year.
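As an illustration of this what-if, the short sketch below uses purely hypothetical category counts (not actual PPS results) to show how moving “Nearly Meets” students up would change the percent meeting benchmark:

```python
# Hypothetical counts by performance category for one grade.
# These numbers are illustrative only, not actual school data.
counts = {
    "Exceeds": 12,
    "Meets": 38,
    "Nearly Meets": 20,
    "Low": 18,
    "Very Low": 12,
}

total = sum(counts.values())  # 100 students in this example

# Percent meeting or exceeding now.
current = 100 * (counts["Exceeds"] + counts["Meets"]) / total

# What-if: every "Nearly Meets" student moves to "Meets" next year.
potential = 100 * (counts["Exceeds"] + counts["Meets"] + counts["Nearly Meets"]) / total

print(f"Meeting now: {current:.0f}%")                 # 50%
print(f"If Nearly Meets moved up: {potential:.0f}%")  # 70%
```

Running the same calculation with your school’s actual counts from page 11 shows the size of the opportunity that sits just below the benchmark.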
Key Findings and Follow Up Questions (Which results surprised you? Which results are you pleased with? What do you need to improve? What questions do you want answered? Be sure to check the PPS Research and Evaluation web site (http://inside.pps.k12.or.us/depts/rne) for more in-depth information on this issue. Who will get those answers, and how?)
Issue 2 (continued):
Has our school’s performance improved over time?
What to Look For
The graph below shows the RIT scores for your students over the past four years, by grade.
How do the RIT score trends compare to the chart of students meeting benchmarks? It’s possible to increase the percent meeting benchmarks by moving a few students near the benchmark higher while ignoring ongoing improvements for other students. In such a case, the RIT average would change little from year to year. A better pattern is to see improvements in both the percent meeting and the average RIT score over time. How close are your average RIT scores to the Meets benchmark for your grade?
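The sketch below illustrates that pattern with hypothetical RIT scores for ten students (the scores and the assumed benchmark of 231 are invented for illustration, not actual PPS data): nudging two near-benchmark students over the line raises the percent meeting sharply while the average barely moves.

```python
# Hypothetical RIT scores for the same grade in two years; benchmark for
# "Meets" is assumed to be 231. Illustrative numbers only, not actual data.
BENCHMARK = 231

year_1 = [215, 220, 225, 228, 229, 230, 233, 236, 240, 245]
# Two students just under the benchmark moved over it; others changed little.
year_2 = [214, 219, 224, 228, 231, 232, 233, 236, 240, 245]

def percent_meeting(scores):
    """Percent of students at or above the assumed benchmark."""
    return 100 * sum(s >= BENCHMARK for s in scores) / len(scores)

def average_rit(scores):
    """Mean RIT score for the group."""
    return sum(scores) / len(scores)

print(percent_meeting(year_1), average_rit(year_1))  # 40.0 230.1
print(percent_meeting(year_2), average_rit(year_2))  # 60.0 230.2
```

The percent meeting jumps from 40 to 60 while the average RIT score is nearly flat, which is why looking at both measures together gives a truer picture of schoolwide growth.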
Key Findings and Follow Up Questions (Which results surprised you? Which results are you pleased with? What do you need to improve? What questions do you want answered? Be sure to check the PPS Research and Evaluation web site (http://inside.pps.k12.or.us/depts/rne) for more in-depth information on this issue. Who will get those answers, and how?)
Issue 3:
Are our school’s demographic groups performing equally?
What to Look For
The graphs below compare achievement across ethnic groups. Look at the 2003 data to see which groups are succeeding and which are falling behind. Use the 2002 bars to see whether the pattern is consistent across the two years.
Are any groups considerably behind or ahead of the European American group? Are some groups closing the gap? Do you see the same patterns for 2002 and 2003?
You will not see information for fewer than ten students in a category. For your own information, consider looking at those students as individuals to make sure they are not all at the lowest benchmark levels. Use pages 12 and 13 in Supporting Results to see how many students you have in each subgroup and how many were tested. This table also shows three years of data to give you a better idea of trends.
Key Findings and Follow Up Questions (Which results surprised you? Which results are you pleased with? What do you need to improve? What questions do you want answered? Be sure to check the PPS Research and Evaluation web site (http://inside.pps.k12.or.us/depts/rne) for more in-depth information on this issue. Who will get those answers, and how?)
Issue 3 (continued):
Are our school’s demographic groups performing equally?
What to Look For
The graph below compares males versus females in terms of achievement.
Is there a noticeable difference between males’ and females’ scores? How does the difference compare to 2002?
Additional data broken down by gender are not available in this workbook. Additional reports, including three-year trends, are available from the R&E web page.
Key Findings and Follow Up Questions (Which results surprised you? Which results are you pleased with? What do you need to improve? What questions do you want answered? Be sure to check the PPS Research and Evaluation web site (http://inside.pps.k12.or.us/depts/rne) for more in-depth information on this issue. Who will get those answers, and how?)
Issue 4:
How do special program students perform on assessments?
What to Look For
This graph compares the performance of students in special programs with the total population of students in the school (shown in the box on the right margin of the graph).
Which subgroups are scoring below the overall population? Which ones are closing the gap between their percent meeting and the school average? Which ones aren’t? How well did the data match your assumptions? If students are below the total group average, is the percent of students in “Very Low” and “Low” decreasing over time? Check the R&E website for multiple years of data for each group by subject area and grade to help look at trends.
Remember that some students in the Limited English and Special Education categories are being counted differently this year. If they took a modified or extended assessment, CLRAS, or challenged down, they were counted as “Not Meeting” in 2003. They were not counted in 2002.
You will not see information for fewer than ten students in a category. Consider looking at those students as individuals to ensure they are not all at the lowest benchmark levels.
Key Findings and Follow Up Questions (Which results surprised you? Which results are you pleased with? What do you need to improve? What questions do you want answered? Be sure to check the PPS Research and Evaluation web site (http://inside.pps.k12.or.us/depts/rne) for more in-depth information on this issue. Who will get those answers, and how?)
Issue 5:
How successful are our school’s long-term students?
What to Look For
This graph shows results for students who have been in your school multiple years and how their performance compares to all students at that grade.
Are long-term students increasing in the percent meeting benchmarks? Our district goal is that more students will meet standards in the higher grade than in the lower grade. How does their performance compare to the total group of students? If there is a difference, how much is acceptable?
If your long-term students represent most of the students at the grade tested in 2003, you won’t see much difference between their performance and that for the total group. Use page 14 in Supporting Results to identify the percent of long-term students in your school and to compare their scores to students who have been in your school less time. The R&E website provides more detailed information about specific groups of long-term students.