Matrix for Appraising Seed ADP Proposals

LEAP

Learning through Evaluation with Accountability and Planning

Review tool for the LEAP Programme Design Document

Programme Information
National office
Programme name
Programme number
Support office name
Programme phase
Start date of this programme phase
End date of this programme phase
Name and position of person submitting the PDD


Review tool for the LEAP Programme Design Document (PDD).

Published: August 2010. Updated: March 2012

© World Vision International

Please provide any feedback on this tool to:

Improved LEAP review tools

The LEAP Review Tools are designed to stimulate a conversation between the programme, the national office (NO) and the support office (SO). Space is provided for a reviewer from each office to give their comments, and for the programme to respond. Space is also provided for reviewers and programmes to agree on actions that should be taken in response to comments. This kind of dialogue should help the programme to identify practical steps that can be taken to improve the programme, and these steps will be mutually agreed between the programme and the support office.

This tool integrates the programme and finance aspects of the document review. Please ensure that both programme and finance staff are involved in reviewing the relevant sections of the document.

All templates and guidance materials for the improved LEAP tools are available on the Horizon LEAP Resources Library, and also on the Guidance for Development Programmes website:

www.wvdevelopment.org

Instructions for using this review tool:

Note: Double click on the headings throughout the document to reveal or hide instructions.


National office role

This review template supports the partnership-wide move towards empowering national offices to take responsibility for the quality of their programmes. National offices are now responsible for ensuring the completeness and consistency of all their programme documentation.

The level of NO DME capability varies widely across World Vision. Therefore, this template supports a process to build the capacity of the NO to take on increasing responsibility for reviewing programme documents in line with their DME capacity (based on the office’s Programme Capability Review rating).

Regional office role

The regional office (RO) will support NOs while they build their document review and programme quality assurance capacity. The RO will also support the NO and SO to make sure that the mechanisms are in place for a healthy partnership. The RO role also includes monitoring the quality and usefulness of reviews. This can be done by checking a sample of completed reviews, or by monitoring issues that arise from the reviews. The RO is not directly responsible for conducting reviews of programme documentation.

Support office role

The responsibility for reviewing the quality and the content of programme documents rests with the NO, but the review process can be supported by SOs where NO capacity still needs to be strengthened. The way in which SOs support the review process will depend on the NO’s DME capability and the capacity of their partner SOs (as determined by the NO Programme Capability Review). One function of the Programme Support Team (PST), which includes NO and SO representatives, is to facilitate the sharing of these responsibilities as NO DME capacity increases.

The SO can continue to use reports for programme support and donor reporting or marketing purposes.


Stage 1 – Basic data:

The reviewer completes the Programme Information and the Review Information tables. Review information should be given for both the National Office review and the Support Office review.

Stage 2 – General review:

All National Offices should complete the general review (Section 1, below) on the LEAP documents before they are posted to Horizon. This general review can be conducted by national or zonal staff, but not by programme staff. The purpose of this review is to check the quality, accuracy and consistency of the PDD. The general review must be signed off by an NO representative before the PDD can be posted to Horizon.

The SO is not required to fill in the general review.

Stage 3 – Detailed review:

A detailed review of the PDD must be conducted covering the programme issues (Sections 2 & 3) and project issues (Sections 4 & 5). These reviews should be conducted by the NO, because the primary responsibility for quality assurance is with the NO. However, where the NO does not yet have the capacity to conduct reviews of all PDDs, this task can be shared between the NO and the partner SOs. This can be agreed within the PST, depending on the NO and SO DME ratings, as determined by the Programme Capability Review. The minimum requirements are:

·  NOs with a Base DME rating must complete the Basic Data and the General Review for all PDDs. They should review the programme and project sections of the PDD for as many programmes as possible, while still ensuring a quality review. Where necessary, they can request partner SOs to review up to 100% of PDDs. The appropriate proportion of PDDs to be reviewed by the NO needs to be negotiated and agreed by the PST.

·  NOs with a Consolidating DME rating must complete the Basic Data and the General Review for all PDDs. They should review the programme and project sections of the PDD for as many programmes as possible, while still ensuring a quality review. Where necessary, they can request partner SOs to review up to 70% of PDDs. The appropriate proportion of PDDs to be reviewed by the NO needs to be negotiated and agreed by the PST.

·  NOs with an Established or Outstanding DME rating are required to review 100% of their PDDs.

Specific guidance for each section of the review tool is given within the tool and is hidden under these symbols:

Double click here for further guidance

Double click on the guidance note symbols to show the guidance.

Double click again to hide the guidance notes.

In the ‘General Review’, boxes A to C have been designed to encourage a dialogue on issues raised during the review. The review of PDDs should follow this sequence:

1.  The reviewer (national or support office) can put comments and suggestions on the PDD in Box A of each section. Once all comments have been written by the reviewer in Box A, the review tool should be returned to the Programme.

2.  The programme can then write their response to the comments and suggestions in Box B, and return the review tool to the reviewer (either NO or SO). If all comments have been dealt with in Box B, to the satisfaction of the programme and the reviewer, then the review is closed and filed.

3.  If there are issues that need to be discussed further, then the reviewer and the programme can discuss and agree any further actions that are necessary. These further actions should be written in Box C.

Stage 4 – Alterations to programme plans:

This review tool is designed to link directly with the annual review and planning cycle of the programme. Any changes to the logframe targets, Detailed Implementation Plan (DIP) or the budgets should refer back to the discussions and agreements made as a result of this review.

Stage 5 – Report review learning:

Each Programme Support Team should convene an annual virtual learning event. Part of the purpose of this event is to build the capacity of the NO and SO to review programme documentation and to ensure consistency across all reviews. A sample of the most recent document reviews should be used for this purpose. During this learning event:

1.  The PST prepares a short trends report on macro/systemic issues noted during all programme document reviews.

2.  The Operations Director/Ministry Quality staff provide a high-level response to the PST on how they are moving to address common issues across programmes, including any follow-up actions to be taken, particularly where an office-wide approach is merited to improve programme quality.

Table of Contents

Please insert a table of contents.

1. General review
2. Review of programme description
3. Review of programme budget
4. Review of project description
5. Review of project budget
6. Conclusions of the detailed review

Glossary

Please ensure the glossary is complete.

ADP / Area Development Programme
CCT / Cross-cutting themes
CAM / Cost Allocation Methodology
DIP / Detailed Implementation Plan
DME / Design, Monitoring, and Evaluation
HEA / Humanitarian and Emergency Affairs
IPC / Indirect Programme Cost
IPM / Integrated Programming Model (now called WV’s Development Programme Approach)
ITT / Indicator Tracking Table
LEAP / Learning through Evaluation with Accountability and Planning
NGO / Non-Governmental Organisation
NO / National Office
PDD / Programme Design Document
PSC / Programme Support Code
PST / Programme Support Team
RO / Regional Office
SO / Support Office
T4 / Codes for funding sources
T6 / Codes for logframe
T7 Programming Categories / Code used in SunSystems to capture the 26 programming categories (from P01 to P26)
TD / Transformational Development
WV / World Vision
Review Information
National office review / Support office review
Name and position of person submitting the PDD
Name and position of reviewer/s
Date PDD submitted for review:
Date review completed and feedback sent:
Date response received:
1.  General Review
The national office must ensure this general review is completed before the document is submitted to the support office.
1.1  Does the document use the latest LEAP templates, or relevant donor templates? / Yes / No
1.2  Is the document complete? Have all sections of the document been filled in with adequate information? Have all the relevant appendixes been included?
If the answer is ‘No’, which sections of the document are missing or inadequate? / Yes / No
1.3  Is there consistency between the basic information contained within the document? Are the beneficiary numbers, output and outcome targets, and budget figures consistent throughout the document, including the appendixes? / Yes / No
1.4  Is planned growth in numbers of Registered Children (RC) realistic? / Yes / No
1.5  Have the numbers of Registered Children been agreed with the SO? / Yes / No
1.6  Is there evidence that the programme has adequate technical and management resources for successful implementation? / Yes / No
1.7  Are programme and project budgets attached?
Are the budgets detailed and accurate?
Are the budgets consistent with the narrative? / Yes / No
A. Comments and suggestions on Section 1 from reviewer:
1. 
2. 
B. Response from the programme:
1. 
2. 
C. Actions agreed to close this set of comments
(these must be negotiated and agreed between the programme and the reviewer):
1. 
2. 
Conclusion of the general review
1.8  This general review has been completed by the relevant national office staff.
Issues arising have been discussed and resolved with the programme.
The PDD is now ready for submission to the detailed review.
Name and position of person who completed the general review:
Date on which the general review was completed:
2.  Review of programme description
2.1  Design methodology:
Is there an adequate description of the methods used by the programme to design the PDD?
Is there evidence of an empowering and participatory approach to programme design that actively involves children, communities, vulnerable groups and partners?
Double click here for further guidance
Factors to be considered can include:
·  Is a thorough description of the research process given?
·  Is there adequate evidence of meaningful participation of communities, partners, children and vulnerable groups?
·  To what extent do the communities and partners own the design process?
·  Have research findings been analysed jointly with communities and partners?
·  Is there a transparent process for World Vision to report back to communities and partners on decisions made in the design process?
2.2  Analysis of context:
How adequately does the PDD describe the environmental, social and political context of the programme area?
Does this analysis give adequate justification for the programme design?
Double click here for further guidance
Factors to be considered can include:
·  Is there adequate analysis of each context issue, including assets and vulnerabilities, causes and trends?
·  How well does this analysis build on what has already been described in the Assessment Report or evaluation?
·  How well does the document describe the nature of vulnerable groups in the programme area?
·  How well does the document describe the nature of civil society in the programme area, and the different groups involved in promoting development?
·  Is there adequate analysis of the role of local government in the programme area?
2.3  Analysis of child well-being status:
How adequately does the PDD describe and analyse the key aspects of child well-being in the programme area?
Does this analysis give adequate justification for the programme design?
Double click here for further guidance
Factors to be considered can include:
·  Is there adequate analysis of each child well-being outcome that is relevant to the programme, including assets and vulnerabilities, causes and trends?
·  How well does this analysis build on what has already been described in the Assessment Report or in the evaluation of the previous phase?
·  Is it easy to understand how each project will contribute to sustained improvements in child well-being?
2.4  Programme rationale:
Does the PDD give a good description of why this programme is relevant and important?
Double click here for further guidance
Factors to be considered can include:
·  Is there an adequate description of how the geographic and thematic areas were chosen?
·  Is it clear how the programme contributes to the priorities and perspectives of the local communities, partners and government?
·  Is it clear how this programme contributes to fulfilling World Vision’s national and regional strategies?
·  How well does this programme build on previous programme phases, or on WV’s previous work in the programme area?
2.5  Programme description and approach:
How adequately does the PDD describe the programme goals and outcomes, and how these will be achieved?
Is the programme description and approach appropriate and realistic?
Double click here for further guidance