REPORT OF THE

EASTERN AND CENTRAL AFRICA

REGIONAL PROGRAM QUALITY GROUP (RPQG) FORUM

4th MEETING

“Strengthening Impact Measurement for Programme Quality – From IMRA to Action”

Nairobi, Kenya

1-3 November 2010

Table of Contents

Day 1

Introductions

Opening words

Meeting objectives and expectations

Country Office and CI member updates - Gallery walk of posters

Overview of IMRA findings

General working discussion of the IMRA

Review of core themes – DM&E competence

Review of core themes – Knowledge management competence

Review of core themes – Quality assurance/programme support

Synthesis and interpretation of findings

Day 2

Update on CARE USA and Impact Measurement

CI funding opportunities for impact tracking improvement

Action plan preparation

Partnering and impact measurement

ANNEX 1a. Workshop participants

ANNEX 1b. Workshop agenda

ANNEX 2. Meeting objectives

ANNEX 3: RPQG meeting - Compilation of expectations received pre-workshop

ANNEX 4: Executive summary of IMRA report

ANNEX 5: CARE Somalia SharePoint platform

ANNEX 6: Global IM update

ANNEX 7: Programme Quality Framework update

ANNEX 8: CO and CI Action plans for IM

CARE Burundi action plan for IM

CARE DRC action plan for IM

CARE Ethiopia action plan for IM

CARE Kenya action plan for IM

CARE North Sudan action plan for IM (partially edited)

CARE Rwanda action plan for IM

CARE Somalia action plan for IM

CARE South Sudan action plan for IM

CARE Tanzania action plan for IM

CARE Uganda action plan for IM

CI-Europe Group (C-Nor, UK, NL, Austria) action plan for IM

CI-USA - CI Members' Action Plan for IM

ECA RMU Group action plan for IM (partially edited)


Day 1

Introductions

Asmare facilitated an opening session in which returning participants introduced themselves and welcomed new participants to the Regional Programme Quality Group.

[see annex 1 for list of participants]

Opening words

Emma Naylor Ngugi, Regional Director - Programme quality is very high on the agenda for CARE as an organization; programme quality, impact and accountability go far beyond a single focus on donors. We want to use the programme approach to accountably address underlying causes of poverty. We are tired of the kind of aid that does nothing to change the fundamental situation of people. The programme approach is asking us to do much more than reorganize our projects; it requires that we show how the things we do for and with our impact groups will achieve lasting transformation in their lives. We also need to make choices and determine where it is most important to invest to achieve change.

Observation - since I joined CARE, I have seen that one of the challenges new people face is that joining CARE is like learning a new language. I have seen entire paragraphs constructed of acronyms. To some extent this is normal for large organizations, but I’d like to make a plea to “try to keep it simple”. Development is complicated enough. Part of our role as leaders is to help staff, partners and beneficiaries to join and engage, and our forms of language and communication are critical to that process.

Meeting objectives and expectations

Delphine presented the meeting objectives:

1)  Ensure a common understanding is reached in the region on the key challenges we are facing around Impact Measurement, including Knowledge Management, DM&E and Program Quality issues as they relate to IM;

2)  Obtain / strengthen CARE senior leadership commitment (at CO, RMU and CI members’ levels) to tackling IM related issues in their COs, in line with the program approach;

3)  Collectively prioritize the issues to address in the next two years as part of a regional IM systemic capacity building action plan for ECA.

[see annex 2 for details on objectives and outputs]

She followed this with a summary of participant expectations as sent to the workshop organisers by RPQG members prior to the workshop. The leading expectations were:

·  Action plans (10 responses), linked with funding opportunities/plans; mostly at CO level, with a couple for regional plans

·  Setting up IM system (10 responses), linked with guidelines, framework, definitions, key elements; mostly at CO level

·  Impact tracking for projects and short term programming (6 responses)

·  Improved understanding about and planning for knowledge management (4 responses)

·  p-shift (3 responses), alignment and interface of programmes, including signature programmes

·  PQAT (3 responses), orientation and experience sharing

[see annex 3 for full set of expectations]

In discussion, participants accepted the workshop objectives and appreciated the expectations sent ahead by a number of the participants. The Regional Director’s opening words stimulated some discussion, which concurred with her on the need for simple and useful language.

“Coming back to language, there is the problem that ‘buzz words’ come and go. ‘Impact’ is now one of these words, but do we have a common understanding? E.g., some donors are asking us to measure ‘impacts’ for 6-month interventions.” Garth

[see IMRA Companion DVD 3 for standard definitions of impact for CARE – English and French]

Country Office and CI member updates - Gallery walk of posters

Participants from country offices and CARE International members brought prepared posters to share their recent achievements in the programme approach and their impact measurement strengths.

A gallery walk was organised with the following guiding questions to be used by the participants as they moved around during the exercise:

a)  Common patterns – what common patterns of strengths and activities are evident?

b)  Resources – what strengths in other offices could be a resource for addressing gaps in your own systems? (Post cards noting the resource and your name.)

Responses and discussion

Common patterns

Strengths - We have an opportunity to become serious about impact measurement in the programme approach, at programme level. Leadership seems to be committed to impact tracking throughout the organization. Offices are striving to know about impacts and be accountable to impact groups. There appears to be some form of Programme Quality and Learning unit in all the country offices. Structural reviews have been initiated in a number of country offices. Knowledge technology – SharePoint sites are coming online in several offices.

Limitations – Country offices are struggling to implement the programme approach in the context of recurrent emergency projects/needs. There are also common gaps in analytical skills, e.g., for reviews and reflection, monitoring and evaluation. Many offices lack sufficient storage or integrated databases. Internet connectivity is often poor, which contributes to silo effects. Another area is knowledge management – many offices recognize the need to improve and coordinate knowledge management.

There were many ideas about what could be done if more resources became available. Within programmes, M&E activities are being budgeted. However, programme quality and impact tracking activities are generally not being donor-funded. CARE International members are resources for funding and technical support, but mostly to their ‘own priority programme countries’. Some CI members are working toward operationalizing impact indicators, including CARE Norway, CARE UK, CARE Austria, and CARE USA.

The participants were able to be self-critical and challenging, as in the following observation:

“The posters seem to reflect some CARE centrism - what about partners?”

There was also hope for more learning activities - exchange learning visits, both internal & external; more time for reflection and learning; and better coordinated technical support. And finally, a desire for more support to impact tracking – integrate p-shift and impact tracking into existing emergency projects; and enable validation of analyses of underlying causes of poverty and theories of change.

Overview of IMRA findings

Impact Measurement Readiness Assessment (IMRA): East and Central Africa Region – summary presentation by Tom

Background and rationale

Recognising that addressing the underlying causes of poverty requires a long term commitment, CARE began in mid-2008 to shift from a project focus to a longer term programme approach. As a result of this change, CARE’s traditional ways of measuring results (by focussing on outputs) are no longer sufficient. CARE USA is now developing a new impact measurement strategy, intending to enable CARE and its partners to measure and track their contribution to creating sustainable change in the lives of specific impact populations of highly vulnerable people in poor communities.

Objective of consultancy

The purpose of this consultancy was to carry out an impact measurement capacity/readiness assessment among ECA COs to inform the development of the new CARE USA impact measurement strategy in a way that will be meaningful, feasible and cost-effective for the COs, their development partners, and their impact groups. More specifically, the impact measurement capacity assessment was designed to assess: 1) The ECA COs’ current/existing capacities to measure impact; and 2) What it will take to ensure COs’ readiness to effectively implement the new impact measurement strategy. This involved assessing the following in each CO: a) Existing M&E systems, processes and practices; b) Strengths and weaknesses of existing Knowledge Management (KM); c) Good practices; and d) Gaps to meet requirements of the new strategy.

Methodology

The IMRA was conducted through a combination of: a) self-assessment tools completed by the COs (using two tools – a qualitative questionnaire on impact readiness and a ranking tool on DM&E capacity); b) review of an extensive set of documents provided by each CO (related to IM, DM&E, p-shift, etc.); and c) site visits to all COs. The site visits included individual interviews and group meetings with key programme and programme support managers and staff. In most of the countries, it was possible to visit a field office for discussions at that level with project and programme teams, including some partners.

The analysis was carried out with a qualitative content analysis approach, blending ideas from a guiding thematic framework included in the ToRs with emerging issues from the field. Preliminary analyses were done in each country and shared back with the CO teams in debriefing sessions within the CO visit, followed by brief summary reports based on the CO group work.

Summary of thematic issues and recommendations

Three major themes emerged from the assessment of country offices’ readiness to step up the scale and quality of impact tracking as they make the shift to a programme approach.

The first of the major themes is DM&E capacity and competence, which is linked to the selection, collection and analysis of data to produce information. The self-assessments of the COs tended to rate their performance in this area as ‘moderate’, at least for output information, but there were many gaps in performance around tracking and analysing outcome/effect and impact level data/information. A particularly important concern is the tendency to mystify impact, regarding it as a numerical product demonstrable only through highly technical baselines and endlines.

The second major theme is KM capacity and competence, which is linked to the production, capture and use of knowledge within the COs and their programmes, and thus, within the organisation and its partners. Although there was some consideration of this issue in the original ToRs for the IMRA exercise, so many issues and challenges arose that it can be regarded as an emergent area of strong concern. It is also a system area that is under-developed in CARE, compared to many other systems. The most critical concern in this area is the absence of any functioning KM strategies or policies in all COs (and at other levels – RMU and CUSA).

The third major theme was originally identified as PQ/L strategies and tools, but further reflection on findings from the field suggests that it is mainly about Quality Assurance and Programme Support. It is linked to the production and application of credible evidence. Ideally, this evidence would be built on good/best practices, quality data and information, and relevant documentary and experiential knowledge. There are, however, significant gaps in quality assurance and reflection measures affecting DM&E and KM, and therefore the quality of evidence about the impacts of CARE’s work. Also of particular concern is the need for holistic, systems-oriented capacity building support to programme quality, including DM&E, KM, and IM – which will require substantive involvement and transformation in programme and programme support.

Emerging recommendations and implications

The recommendations arising from this IMRA have been articulated to address the IM challenges of the ECARMU COs, plus the RMU and CUSA as they relate to them. Meanwhile, there is evidence suggesting that similar issues pervade the IM competency of all of CARE’s COs, as indicated by IMRA exercises conducted during 2010 in Malawi (SARMU) and in Mali and Niger (WARMU).

The FY10 PQAT analysis showed that IM is the area most in need of improvement:

“Impact measurement & learning: Programs are struggling with how to measure their impact and capture and share ongoing learning, in particular how to integrate these processes across the various projects and other initiatives contributing to the program. This area had the lowest rating overall.”

The following two core recommendations are overarching; they span all of the analysis of strengths, challenges and recommendations:

·  CARE International, including all CI members, RMUs and COs to reflect on the issues and recommendations of this analysis followed by strategic prioritisation and resourcing of systemic capacity building toward IM competence as an organisation

·  CO PQ/L teams, with ECARMU & programme partners to develop coherent and systemic CO capacity building plans for enhancing IM (and its related system components of M&E, KM and PQ/L) that will include holistic attention to building on strengths, addressing gaps and making improvements in tools, skills/knowledge, staff/infrastructure, and organisational structures/systems/roles.

[see ANNEX 4 for executive summary of the IMRA report with further details of recommendations and way forward]

As a closing quote, Tom reminded the participants about our collective aspirations from the last RPQG meeting, which occurred at about the halfway point in the IMRA exercise.