Meta-Evaluation Checklist

Concept of Evaluation

_____Definition – How is evaluation defined in this effort?

_____Purpose – What purpose(s) will it serve?

_____Questions – What questions will it address?

_____Information – What information is required?

_____Stakeholders – Who will be served?

_____Evaluator(s) – Who will do it?

_____Process – How will they do it?

_____Internal communication – How will communication be maintained among the evaluators, the sponsors, and the stakeholders?

_____Internal Credibility – Will the evaluation be fair to persons inside the system?

_____External Credibility – Will the evaluation be free of bias?

_____Security – What provisions will be made to maintain security of the evaluation data?

Contractual/Legal Arrangements

_____Client/evaluator – Who is the sponsor, and who is the evaluator?

_____Evaluation products – What evaluation outcomes are to be achieved?

_____Delivery schedule – What is the schedule of evaluation services and products?

_____Editing – Who has authority for editing evaluation reports?

_____Access to data – What existing data may the evaluator use, and what new data may be obtained?

_____Release of reports – Who will release the reports and what audiences may receive them?

_____Responsibility – What is the responsibility of the evaluator?

_____Finances – What is the schedule of payments, if any, for the evaluation?

Evaluation Design

_____Objectives and variables – What is the program designed to achieve, and what variables will be examined?

_____Investigatory framework – Under what conditions will the data be gathered (e.g., experimental design, case study, survey)?

_____Instrumentation – What data-gathering instruments and techniques will be used?

_____Sampling – What samples will be drawn, and how will they be drawn?

_____Data gathering – How will the data-gathering plan be implemented, and who will gather the data?

_____Data storage and retrieval – What format, procedures, and facilities will be used to store and retrieve the data?

_____Data analysis – How will the data be analyzed?

_____Reporting – What methods will be used to report the evaluation findings?

_____Technical adequacy – Will the evaluative data be reliable, valid, and objective?

Management

_____Organizational location – Through what channels can the evaluation influence policy formulation and administration?

_____Policies and procedures – What established and/or ad hoc policies and procedures will govern this evaluation?

_____Staff – How will the evaluation be staffed?

_____Facilities – What space, equipment, and materials will be available to support the evaluation?

_____Data-gathering schedule – What instruments will be administered, to what groups, according to what schedule?

_____Reporting schedule – What reports will be provided, to what audiences, according to what schedule?

_____Installation of evaluation – Will this evaluation be used to help the system improve and extend its own internal evaluation?

_____Budget – What is the internal structure of the budget, and how will it be monitored?

Utility Questions

_____Evaluator’s values – Will the evaluator’s technical standards and values conflict with the client system’s and/or sponsor’s values; will the evaluator face any conflict-of-interest problems?

_____Judgments – Will the evaluator judge the program; leave that up to the client; or obtain, analyze, and report the judgments of various reference groups?

_____Objectivity – How will the evaluator maintain objectivity?

_____Prospects for utility – Will the evaluation meet the utility criteria of relevance, scope, importance, credibility, pervasiveness, and timeliness?

_____Cost/effectiveness – Compared to its potential payoff, will the evaluation be carried out at a reasonable cost?

A modification of “An Administrative Checklist for Reviewing Evaluation Plans” by Stufflebeam (1974)