Archived Information

SERVE’s Response

to

Interim Evaluation of the SouthEastern Regional Vision for Education Report

I. Introduction

We appreciate this opportunity to respond to the interim evaluation report for the Regional Educational Laboratory at SERVE. The report reflects the hard work of the evaluation panel and the excellent coordination of the study by the contractor. We commend both the panel and the contractor for the professional manner in which they carried out this work.

This response includes a brief introduction, followed by comments on the panel's responses to each of the eight evaluation questions and summary comments. An appendix offers staff’s clarifications for factual statements in the panel’s report that might be misconstrued if taken literally or out of context. All staff involved in the site visit have received the panel's report and have had the opportunity to contribute to this response.

As the overview of the report points out, the SERVE organization is just completing a redesign of its structure and operations as a result of a complete turnover in its executive staff. The executive director, John Sanders, joined SERVE August 1, 1998. He brought with him 25 years of lab leadership experience. Jean Williams became the Deputy Executive Director for Programs on February 1, 1999. Williams was previously a program leader here and is very familiar with the southeast education community. Richard Basom became the Deputy Executive Director for Planning and Development on June 1, 1999. Basom previously held leadership positions in two other regional educational laboratories. Elliott Wolf, SERVE's Director of Operations, has been with the lab since its inception in 1990. Before joining SERVE, he was a member of the program staff for the previous contractor of the southeast regional educational laboratory. This experienced executive management team is responsible for the operation of the Regional Educational Laboratory at SERVE. The operation is distributed across 11 organizational units that report to the deputies. It is geographically distributed across three office locations—Atlanta, GA; Greensboro, NC (corporate headquarters, on the campus of UNCG); and Tallahassee, FL. The lab is guided by a 40-member Board of Directors that convenes twice a year and stays informed about lab work through document review and selective participation in lab-sponsored events. The Executive Committee of the Board meets more often (quarterly) to provide greater continuity of oversight.

As the panel's report reminds us, SERVE's mission is to promote and support the continuous improvement of educational opportunities for all learners in the Southeast. The notion of "continuous improvement" is deeply embedded in the organizational culture of SERVE. We reflect on and evaluate our activities so that we can improve the services and products we provide to our customers, and we constantly monitor our customers' level of satisfaction. In keeping with our mission, we accept with appreciation the improvement recommendations in the panel's Synthesis Report. This response, coming only two weeks after our receipt of the report, reflects only our initial reading and reflection. Generally, the panel's recommendations reinforce staff and board members' perceptions about the strengths and noteworthy accomplishments of this organization, as well as the targeted improvement efforts underway at SERVE as a result of our recent reorganization.

In addition to the above structural changes and features of SERVE, it is important to point out some areas of emphasis in our operations. First, quality assurance is integral to SERVE's work. We are a research and development organization affiliated with the University of North Carolina at Greensboro. To continue to receive the high customer satisfaction ratings the panel cites, we need to continuously review and improve our quality assurance processes. Second, SERVE extensively evaluates its program work with assistance from an internal evaluation unit led by an experienced evaluator-manager, Jerry Natkin. However, Natkin and staff determined that it would be useful henceforth to use a third-party evaluator to audit the internal work, and that will be the procedure used in the future. Third, the evaluation focus of the programs in the fourth and fifth years of this contract period will include student achievement effects. The first years of the contract focused on getting innovations up and running as intended; now we can look productively at student achievement effects as well as factors that affect taking innovations to scale. Fourth, with three principal offices and distributed staffing, SERVE puts heavy emphasis on communications between and among programs, offices, projects, and partner/network organizations. To that end, the internal technology unit, directed by Gregory LeePow, provides a state-of-the-art LAN/WAN communications infrastructure across the organization. Audio and video connections keep staff in touch with each other and with the work of SERVE, and regular meetings that utilize this infrastructure are held for various staff task-groupings and for the staff as a whole. Fifth, a database that includes information about SERVE customers by state and by role group is one of the products that SERVE is committed to produce during this contract period.
This database will enable SERVE to improve its understanding of program effects and of the emerging needs of its clients. These five selected features of SERVE operations provide a basis for the following commentary on the panel's report.

II. Implementation and Management
  1. To what extent is the REL doing what it was approved to do during its first three contract years?

In its report the panel stipulates that SERVE is current on its REL contract activities and products; that the strategy of housing SERVE policy analysts in the state capitols is working as planned; that collaboration and the establishment of networks, partnerships, and alliances appear to be a strength; and that in its relatively brief history as a regional educational laboratory SERVE “...has created a powerful infrastructure for wide-scale impact across its service region.” To capitalize on these strengths, the panel suggests greater attention be paid to “...substantive communication and coordination/collaboration between projects, within programs, and across programs.”

Our Response to the Recommendations:

We concur that communication within and across offices and staff teams is critical to the success of SERVE's organizational structure and REL program strategy. The executive management team (which meets weekly) and the unit managers team (which meets at least quarterly) regularly review our progress in this area. In addition to these staff groups, the staff of individual programs meet regularly to review, plan, and evaluate their efforts. To assist staff teams with their communications, SERVE provides state-of-the-art communications in each of its offices—800-number access to voice and video lines for staff and customers, and e-mail and Website access with LAN/WAN architecture and support. In keeping with its commitment to review and improve communications between and among staff groups, the Executive Director and the two Deputy Executive Directors held meetings for all staff in each of the three main offices immediately following the departure of the interim evaluation panel. The purpose of these meetings was twofold: to review the oral report of the interim evaluation team and to plan a series of staff meetings leading up to a two-day, all-staff meeting in mid-December. The mid-December meeting will feature the rollout of a refined strategic plan for SERVE that benefits from consideration of the improvement suggestions in the panel's report as well as the intense work of various staff groups guided by our internal evaluation data. This is just one example of the seriousness with which we view the panel's observation about the importance of cross-organizational communications, given SERVE's complex structure and large service area.

  2. To what extent is the REL using a self-monitoring process to plan and adapt activities in response to feedback and customer needs?

SERVE takes great pride in the reputation it has built with its clients for high-quality, relevant, and responsive work. In fact, the panelists noted that SERVE’s

“… responsiveness to customers is one of the major overall strengths of this Laboratory. That is probably one reason why SERVE’s work is so well received and why in its short life it has won the praise and confidence of state department officials, its Board, and its clients.[1]”

While the panel cited many strengths regarding SERVE’s record of responsiveness to customer needs and requests, it also pointed out opportunities for improvement and offered several recommendations. As mentioned in the introduction to this response, by the time of the site visit SERVE had already initiated an ambitious improvement effort. For the most part, we agree with the feedback from the panel and are encouraged that it confirms the direction of our plans.

Our Response to the Recommendations:

“Institute a rigorous, external and totally independent quality assurance process, linked with increased internal QA.”

We have already begun to address this recommendation. First, SERVE has begun to assemble a panel of external evaluators that will be involved in regularly scheduled site visits (semiannual or annual) to contribute to a continuous reflection process with SERVE staff. This panel will consist of three to five experienced evaluators who will review proposed R&D project plans, evaluation designs, and R&D findings for research rigor. In addition to this panel of evaluators, SERVE plans to contract with an external evaluation center (e.g., Western Michigan University) to routinely assess all evaluation papers and products using The Program Evaluation Standards: How to Assess Evaluations of Educational Programs, 2nd ed. (The Joint Committee on Standards for Educational Evaluation).

We are revising our quality assurance procedures for R&D projects to place more emphasis on research-based evidence of effectiveness, both in the literature base upon which our work stands and in the claims of effectiveness we can make about our products based upon empirical evidence. As part of this effort, our procedures require that project plans undergo a more systematic external review by content and methodological experts before approval and implementation.

“Build ways to define and gather student and school success data at the construction phase rather than at the end of development, and/or utilization of Lab products and services. See it as a goal (with steps toward its attainment), rather than a by-product (of teacher training or other reforms).”

The emphasis on the monitoring of student achievement at SERVE reflects a major and relatively recent shift in programmatic emphasis found in most (or all) regional laboratories. The original request for proposal (RFP) for the regional laboratory program did not emphasize student achievement as measured through state assessments as one of the major outcome measures for which regional laboratories would be held accountable. Thus, the original technical proposal focused not on student achievement but on other outcome measures. Our focus on student performance, particularly as measured through state assessments, has come in response to requests from the field (i.e., our customers—OERI and the states).

The panel correctly recognized that student achievement is a recent addition to our work. By the time of the DIR site visit, SERVE had already begun to incorporate the measurement of student impact into our research and development projects and programs. In addition, the new R&D procedures will require projects to gather and analyze student impact data. This will enable SERVE to make cogent claims of effectiveness about its R&D products based upon empirical evidence.

“Build in critical analyses that might allow adaptations, changes, and growth of a given program to make it better or to learn how to target and adapt the existing program for particular populations.”

The purpose of the external evaluation panel and evaluation review process is to promote more critical review of products and projects. In addition, SERVE program director meetings will continue to promote cross-program sharing of lessons learned. SERVE's Publications and Quality Assurance Unit already has plans to begin conducting focus groups to gather more systematic feedback from target populations on products under development. One purpose of these focus groups will be to identify how a particular product can be adapted and better targeted for other populations. External reviewers will also provide more critical input into our project work.

“When screening materials and programs that might shape or be used in Lab projects, limit the term “research-based” to empirically supported programs and those grounded in data-driven, demonstrated student success. Add understanding of their strengths and limitations, the limitations of the literature-based knowledge, and the levels of tentativeness that are inherent in the term “research-based.” There is a need to be vigilant because of the scalability factor in the infrastructure—one must refrain from disseminating anything with less than full honesty about its potential for success. The scope of adding this level of rigor is obviously beyond the reach of current staffing and organizational mechanisms and will need to be done through an external system of networks or evaluators, or through a combination of internal-external controls.”

This recommendation is an important reminder for all SERVE staff, and one that should be reinforced in any good R&D organization. One way that we are striving to institutionalize this tenet of R&D at SERVE is to constantly challenge our project staff regarding the claims of effectiveness they can make about their products and the cogency of the evidence they use to support those claims. External reviewers (content and methodological) will also bolster our rigor and critical analytic capacity. Periodic seminars among staff and emphasis from the executive management team will help keep us “vigilant.”

III. Quality

  3. To what extent is the REL developing high quality products and services?

The panel's report recognized the "range of quality products and services that make up its [SERVE's] two Signature Works, the first relating to topical areas of assessment, accountability, and standards and the second to broader comprehensive school improvement thrusts." At the same time, the panel had four improvement recommendations. We comment on each of those recommendations in the following paragraphs.

Our Response to the Recommendations:

“Expand collaboration…to enhance programmatic quality.”

We appreciate the suggestion that the two Signature Work Areas have significant collaborative potential that has yet to be fully explored. We believe that as SERVE's new program structure takes hold and as the new Deputy Executive Director for Programs fully implements her management and supervisory plans, both the opportunity and the incentives for the desired collaboration will be enhanced substantially. Indeed, the program directors representing the two Signature Work Areas have a history of collaboratively developing highly valued and high-profile SERVE events, such as the Regional Forum and the Seminar on Low-Performing Schools. However, we accept that as we evaluate how well our new organizational structure is working we need to look at the extent to which:

  • lessons learned are shared within and across projects/programs;
  • product designs are improved as a result of internal and external critiques; and,
  • services to schools are more carefully targeted and integrated across projects/programs.

"For planned interventions, reach beyond the region to ensure they reflect the most current thinking and research in the field. Use of outside content experts, not necessarily from the region (depending on the issue), is critical to ensure accuracy and timeliness of information being imparted."

We agree it is helpful to use outside experts (researchers and practitioners) to help in planning major SERVE products and activities. Examples abound in SERVE's product portfolio of the value of such collaboration. Achieving Your Vision of Professional Development, a collaborative effort with David Collins, a Florida school executive, was recognized by the National Staff Development Council as its 1998 Book of the Year; fifteen practitioners and eight staff worked with Dr. Collins on the publication. A Study Guide for Classroom Assessment: Linking Instruction and Assessment, co-authored by SERVE staff and NC Department of Public Instruction staff, won the AERA Division H Outstanding Publication Award for 1998. It is clear from these examples that SERVE "knows how to do this." It is equally clear, however, that we need to improve our procedures so that this happens routinely. The Executive Management Team will focus on this issue in the context of strengthening the research base of our program areas. Being part of the University of North Carolina System gives staff access to outstanding expertise, national and international, that we should utilize more fully.