Beyond Information. Intelligence.
July 28, 2009
Anna Viggiano, Ph.D.
Educational Specialist
Gifted and Talented Program
475 22nd Avenue
Honolulu, Hawai‘i 96816
Dear Dr. Viggiano:
We are pleased to present the Statewide Learning Center Evaluation Report for SY 2008-2009. To facilitate comparison over time, we have recalculated all scores for 2006, 2007, 2008, and 2009 using the same procedures.
We thank you for the opportunity to have worked with your program, and look forward to working with you again.
Sincerely,
James E. Dannemiller
President
CONTENTS
I. INTRODUCTION 1
A. The Learning Center Program 1
B. Program Evaluation Procedures 2
II. EVALUATION RESULTS FOR 29 PROGRAMS, 2009 6
A. Evaluating the Hawai‘i Learning Center Program, 2009 6
B. Learning Center Criteria Scores, 2009 8
C. Evaluation Score by Program Type 8
D. Comparisons with Previous Years 9
III. APPENDIX 12
Appendix A: Narratives for Learning Centers 13
Appendix B: Survey Questions 146
Appendix C: Critical Review of Scores 153
LIST OF TABLES
Table 1: Data Collection Instruments 2
Table 2: Description of LC Program Criteria, Data Sources & Data Points 3
Table 3: Five Learning Center Performance Profile Categories 5
Table 4: Statewide Learning Center Evaluation: LC Achievement Score Profiles 7
Table 5: Evaluation Scores by Program Type 9
Table 6: Eleven-Year Summary of Statewide Learning Centers: LC Profiles 11
I. INTRODUCTION
A. The Learning Center Program
Hawai‘i’s Learning Center Program is a statewide categorically funded program designed to provide Hawai‘i Department of Education (DOE) students and their parents with greater academic choice by creating distinctive, specialized, and excellent programs within public education.
Learning Centers are established within existing DOE schools. Each center is a specialized program organized around a single theme or subject area. Each is expected to provide innovative and excellent opportunities for learning, using existing resources at the host school and tapping additional resources in the surrounding community. Learning centers are open to all public school students regardless of district or school boundaries. The sole criterion for admission is an interest in acquiring or developing particular knowledge and skills available at the Learning Center.
Programs offered at Learning Centers are expected to enrich and expand student learning experiences by introducing new or formerly unavailable courses or activities. They provide adjunct enrichment experiences through community involvement, and they provide new and expanded experiences by integrating existing courses with the host school’s program of study.
Learning Centers are expected to provide equal educational opportunity through truly open choices for public school parents and their children. The centers are not to serve as special programs for students who do not succeed in regular schools, nor are they intended to serve only the very bright or most gifted students. Centers are prohibited from excluding or segregating students because of race, financial or social status, academic achievement, or previous educational experience. Admission to entry level courses must be open to all, based only on interest and space[1]. Learning Centers are expected to provide automatic geographic exceptions to students who request them.
The Hawai‘i Learning Center program has four goals:
1. Expand educational choice for public school students with special interests and talents.
2. Provide public school parents with new choices about the kind of education they want for their children.
3. Make efficient use of educational resources, such as facilities, staff, and equipment.
4. Encourage school-community collaboration and use of high quality, technologically advanced community resources.
B. Program Evaluation Procedures
Hawai‘i’s Learning Center Program has been evaluated nearly every year since it began in 1987. Between 1987 and 1993, Learning Center evaluations were conducted on an ad hoc basis, either internally or with the assistance of evaluation contractors. In 1994, an external evaluation laid the groundwork for a comprehensive program evaluation system that was comparable from year to year. External evaluation continued through 1996. In 1997, the data collection system for learning center evaluation added the Automated Data System (ADS), a computer-based data collection and analysis system for learning centers. The ADS was updated and fine-tuned in 1998, and was used for internal program evaluation from 1999 through 2002. In SY 2003-2004, the Office of Instructional Services engaged SMS to collect evaluation data from learning centers, provide assistance as requested by Learning Center coordinators, assemble the ADS data, perform the data analyses, and prepare the evaluation report. SMS has performed those evaluation services since 2004. For the present report, SMS used the ADS to assess the outcomes of the Learning Centers in School Year 2008-2009.
Data Sources
Data used to evaluate Learning Centers are gathered from several sources using the set of six data collection instruments shown in Table 1. All of the survey instruments are available either on-line or for printing from the ADS. To facilitate planning for continuous improvement, the data collection instruments have not been altered since 1994.
Table 1: Data Collection Instruments
Instrument / Description
Student Survey / Self-administered survey to gather opinions and behaviors from LC students.
Parent Survey / Self-administered survey to gather reactions and opinions from LC parents.
Coordinator’s Report / Lengthy report completed by Learning Center Coordinators to report data on LC themes, programs, selection and enrollment procedures, special events, achievements, etc.
Teacher Survey / Self-administered survey to be completed by host school teachers whose skills or courses are integrated within learning center program offerings.
Administrator’s Survey / Self-administered survey designed to gather information on how the LC is integrated into the school mission and how it is valued.
Grades and Scores / A format provided to the DOE SIS program for reporting the gender, ethnicity, grades, and SAT reading/math scores for all students in host schools.
Objectives
As part of the evaluation design process formalized in 1994, a set of Learning Center Program objectives was developed by the evaluator and approved by the Learning Center Coordinators Committee. Beginning with the four program goals, and considering expectations of Learning Centers from within and outside of the Program, a set of 11 evaluation objectives was defined; these objectives have remained unchanged since 1994. The 11 objectives (see Table 2) were the central focus of the Learning Center Program evaluation this year.
Table 2: Description of LC Program Criteria, Data Sources & Data Points
Criteria / Description of Program Criteria / Data Sources / Data Points
Expanded Choice / Measures the extent to which LCs offer learning experiences that are unavailable in the regular classrooms. Are the program offerings organized around a theme, are they different from regular school offerings, and do they enrich student options? / Students, Parents, Teachers, Administrators, Coordinator / 21
Collaboration / Measures the extent to which the LC collaborates with business, professional, and community people and how well that collaboration contributes to the host school. / Teachers, Administrators, Coordinator / 33
Integration / Measures the extent to which an LC integrates the host school's teacher skills and school facilities into its program and its Advisory Council, and how well the LC garners support from and communicates with the non-LC staff and administrators. / Parents, Teachers, Administrators, Coordinator / 23
Student Achievement / Measures how well the LC contributes to the skills and learning of LC students. / Students, Parents, Teachers, Coordinator / 8
Personal Growth / Measures the extent to which students, parents, and coordinators feel the students experience growth in LC activities. / Students, Parents, Coordinator / 10
School Improvement / Measures the extent to which teachers, administrators, and LC coordinators feel that the host school benefits from having an LC on campus. / Parents, Teachers, Administrators, Coordinator / 18
Quality of Program Resources / Measures the extent to which elements of a quality program exist at the LC, including high quality curriculum, innovative practices, active learning, clear standards, multiple types of assessment, excellent equipment and facilities, and adequate resources. / Students, Parents, Teachers, Administrators, Coordinator / 44
Equal Access / Measures the extent to which the LC has students from outside its geographic location, how well the LC reaches out to students beyond school boundaries, and whether the program meets the needs of students in and outside of school boundaries. / Parents, Teachers, Administrators, Coordinator / 26
Equity / Measures the extent to which the LC students represent a cross-section of the socio-demographic composition of the host school's student body and of their academic abilities (represented by norm-referenced test scores). / Teachers, Administrators, Grades and Scores / 26
Recognition / Measures the extent to which the LC program, teachers, and students have received national, state, district, school, and local community recognition (awards, recognition, publications). / Coordinator / 12
Constituent Evaluation / Measures the extent to which LC constituencies (parents, teachers, students, and administrators) want to continue with the LC at the host school. / Students, Parents, Teachers, Administrators / 6
Analysis
The 11 scores used to measure Learning Center progress toward program goals are based on 227 individual measurements taken from the six sources shown in Table 1. Survey data are collected on-line or using paper-and-pencil forms. If surveys are completed on-line, responses are recorded automatically in the LC Database for each school. If data are collected on paper surveys, responses are entered into the ADS at the Learning Center site[2].
Once data are entered for all six data sources, the ADS calculates the analyses automatically. No further input from LC or evaluation staff is required or permitted. Since 2004-2005, SMS has performed the analyses for individual centers and combined the data into the final evaluation data file. SMS aggregates data for individual surveys, summing responses separately for students, parents, teachers, and administrators. The ADS program isolates the individual items that make up each of the 11 scores and calculates a summary score for each one[3]. Formulae for generating scores were developed in 1994, adjusted in 1997, and two minor adjustments were made in 2008[4]. To facilitate comparison, SMS adjusted the two altered scores and the total score for program years 2004-2005 through the present.
Scores
All scores used to evaluate Hawai‘i Learning Centers are "excellence scores." Standards for evaluation are rigorous, in keeping with the understanding that Learning Centers are the DOE’s flagship programs and are expected to incorporate only the highest levels of quality in all program components. Each of the 11 scores calculated by the ADS summarizes 6 to 44 individual data points taken from the data sources shown in Table 2. Regardless of the measurement metric used in the surveys, ADS scoring procedures reduce individual survey data to a simple “excellent vs. all other” scale. Summary scores for each of the 11 components are rescaled so that each ranges from 0 to 100. For a score like the Program Quality Score, a zero means that none of the 44 component items for that score reached the level of excellence; even if all 44 items were one point below the excellence level, the Program Quality Score would still be zero. A score of 100 means that every one of the 44 component items was rated as “excellent” by students, parents, teachers, administrators, and the Learning Center Coordinator.
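To illustrate the scoring rule described above, the sketch below computes an excellence score as the share of component items rated at the excellence level, rescaled to the 0-100 range. The function name, the 5-point rating scale, and the use of Python are illustrative assumptions made for this report; they are not the actual ADS formulae, which were developed in 1994 and later adjusted.

    # Minimal sketch of excellence scoring, assuming items have already been
    # reduced to "excellent vs. all other" as described above. The 5-point
    # scale with 5 = excellent is a hypothetical example, not the ADS metric.

    def excellence_score(item_ratings, excellent_value=5):
        """Return a 0-100 score: the share of items rated at the excellence level."""
        if not item_ratings:
            return 0.0
        excellent = sum(1 for rating in item_ratings if rating >= excellent_value)
        return 100.0 * excellent / len(item_ratings)

    # A Program Quality profile of 44 items, all rated one point below
    # excellence, scores zero; 33 of 44 items rated excellent scores 75.
    print(excellence_score([4] * 44))             # 0.0
    print(excellence_score([5] * 33 + [4] * 11))  # 75.0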
The LC evaluation design is very demanding. Over the years, some have felt that the standard should be relaxed, but most observers agree that excellence is the appropriate standard for evaluating learning centers in Hawai‘i. In past evaluations many centers have achieved scores above 80, and some consistently score above 90. New centers have been able to improve their scores over time. This suggests that the excellence criterion is a reasonable, albeit stringent, standard for LC program evaluation.
Under excellence scoring, a score of 50 should not be interpreted as indicating a “bad” or “failing” program. An ADS score of 50 means that “only” half of the items used to judge a program reached the level of excellence; the remaining items may all be just one point below excellence.
Finally, using ADS scores to compare one program against another has proven to be ineffective. Learning centers have unique programs with components that are usually not transferable. ADS scores measure where LCs are in their pursuit of excellence, and identify areas that need improvement. Year-to-year change in performance, relative to program performance objectives, is the appropriate measure of success or failure.
Reporting
The present evaluation reports summary results for each of the 11 objectives and lists scores for each Learning Center. The scores were produced by SMS using the ADS, with no input from LC staff or coordinators. Scores are presented for review by Learning Center Coordinators. Analysis details, including copies of the ADS database and printouts of the summary scores, are submitted each year to the DOE Evaluation Section.
Since 1998, Learning Centers have been classified into one of the five categories listed in Table 3. The classification system is based on each Center's performance profile across ten program criteria (the Constituent Evaluation Score is not used in this classification system). The five performance categories were collaboratively derived and are based on the goals and intent of the State Learning Center Program as specified in the 1998 Guidelines and revised in Fall 2000. An illustrative sketch of how these rules apply to a score profile follows Table 3.
Table 3: Five Learning Center Performance Profile Categories
High Performance: Scores of 75 or higher on at least half of the 11 program criteria.
Solid Performance: Scores between 50 and 75 on more than half of the 11 criteria.
Mixed Performance: Some scores of 75 or higher, some scores less than 50, and some between 50 and 75.
Problem Performance: Dominated by scores below 50 with some scores between 50 and 75.
High Need: Scores of less than 50 on more than half of the 11 criteria.
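As a rough illustration of the Table 3 rules, the sketch below assigns a performance category to a list of criterion scores. The ordering of the checks and the handling of borderline profiles are assumptions made for this example; the actual classification is carried out by the evaluator using each Center's full profile.

    # Minimal sketch of the Table 3 classification rules. Borderline and
    # tie-breaking behavior is assumed; it is not specified in the guidelines.

    def classify_profile(scores):
        """Assign one of the five performance categories to a list of criterion scores."""
        n = len(scores)
        high = sum(1 for s in scores if s >= 75)
        low = sum(1 for s in scores if s < 50)
        solid = n - high - low                      # scores between 50 and 75

        if high >= n / 2:
            return "High Performance"
        if low > n / 2:
            return "High Need"
        if solid > n / 2:
            return "Solid Performance"
        if low > high and low > solid:
            return "Problem Performance"
        return "Mixed Performance"

    # Example: a hypothetical ten-score profile dominated by mid-range scores.
    print(classify_profile([82, 77, 64, 58, 71, 69, 55, 61, 52, 48]))  # Solid Performance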
The preferred evaluation system for an individual Learning Center compares current excellence scores against previous performance. The ideal program management strategy would use the scores to develop specific program improvement objectives for the next year, then develop program plans to achieve those objectives, and measure performance against those objectives. For example, a Center Coordinator might notice a drop in the Integration Score and make plans to reverse it next year. A plan to achieve that objective might involve recruiting specific host school teachers to participate in Learning Center programs next year. The Coordinator might set an objective to increase the Integration Score by 12 points, and measure success on that objective by subtracting this year’s Integration Score from next year’s Integration Score.