eLearning Africa 2007

Nairobi, Kenya

Interactive workshop:

Monitoring and Evaluation of ICT4E initiatives

Facilitators: David Hollow and Tim Unwin (the ICT4D Collective and UNESCO Centre for ICT4D, Royal Holloway, University of London)

Monday 28th May, 14:00-18:00

Workshop Aims

The workshop was designed to explore key issues in designing and delivering effective M&E (monitoring and evaluation) of ICT4E (ICT for education) initiatives in Africa. Participants formed three groups for the breakout sessions on baselines, impact on learning and donors. Each session was split into two parts: the first discussion focused on identifying the key problems pertaining to the topic, and the second on potential solutions to those problems.

The reports from each session are included below, together with an additional list of comments regarding effective M&E of ICT for education.

Breakout workshop 1 – Baselines

Problems and Challenges Identified

  • Ensuring adequate access to information is difficult – especially when attempting to collect data from government organisations and people who do not want to disclose information for political or financial reasons.
  • Conducting an effective baseline survey is costly. Implementers are often reluctant to commit the required money due to limited project funds.
  • The methodological approach employed is often not appropriate – for example, a questionnaire may not be the best way to gather information from a community with low literacy levels.
  • Gathering information from a community without involving gatekeepers can prove problematic, as they may seek to thwart the efforts of the researcher.
  • Those from whom data is being collected may be unwilling to participate without some form of incentive. (An example was cited where one of the people being sampled asked the interviewer if he was paid to do the baseline - when the interviewer answered in the affirmative he asked why he should not then be paid for providing information.)
  • It is difficult to check the validity of the responses given. (Cases were mentioned where people provided wrong information for their own purposes; the answers they gave were contrary to the realities on the ground.)
  • The baseline data collection can be limited by the structure of the project evaluation framework.
  • The M&E plan is often not put in place at the beginning of the project but is added as an afterthought – potentially in order to justify the project and satisfy donors.
  • Baselines can be manipulated to suit different agendas – especially where the pressure to maintain funding affects the objectivity of the implementers.
  • There is an overemphasis on quantitative methods when constructing baselines – this leads to a one-sided perspective on reality.

Possible Solutions

  • Projects should be based on a concrete baseline. If the design of a project is founded on baseline outputs then it is more likely to be successful.
  • A baseline is to M&E what a foundation is to a house. A strong foundation determines strength and durability, and it should therefore be accepted policy to have the baseline well defined before funding is received.
  • Incorporating more qualitative methods into data gathering for the baseline would provide a richer picture of the situation.
  • Researchers should employ innovative approaches when gathering data – establishing ways to gather information from respondents without pre-empting their answers.
  • Significant background research should be conducted before carrying out a baseline. Establishing the background of the community, previous research conducted and available literature all combine to provide a stronger base for the survey.
  • Increased consideration should be given to the ethical dimensions of conducting a baseline survey.
  • All stakeholders should be informed of the purpose for the baseline and should give informed consent for this.
  • The findings of the baseline should always be disclosed to the stakeholders.
  • Gatekeepers and respected individuals within the target community are a useful source of information for the baseline.
  • Validity is increased through ensuring that the sample for the baseline is truly representative of the target audience for a project.
  • The widely accepted ‘bottom-up’ approach to a programme should also be adopted when considering and constructing the baseline.
  • Sustained cooperation of the respondents in gathering baseline data may require motivation or appropriate incentives.

Breakout workshop 2 – Impact on learning

Problems and Challenges Identified

  • There is considerable frustration in measuring the direct impact of ICT on learning – much of the impact will be seen only well after the intervention. With education you invest today and harvest in 15 years.
  • Teachers themselves are often unclear about the impact of ICT on learning and so are unlikely to make maximum use of its benefits.
  • It is impossible to assess impact on learning based only on measuring in the classroom – many things are missed by doing this because ICT is a 21st century learning skill that goes far beyond the classroom.
  • It is difficult to define the impact of ICT on learning with quantifiable measurements – it can lie in the changing of mindsets and the introduction of new ways of thinking.
  • It is expensive to assess impact on learning – people are unwilling to meet this expense because of the lack of strong evidence that there is an impact.
  • There is difficulty in quantifying return on investment/value for money when considering impact on learning.

Possible Solutions

  • The objective of the intervention must be very clear in order to know what to measure in ascertaining impact.
  • Impact on learning should be viewed long term. (Example of building a bridge that lasts for 15 years – additional benefits come only once it is constructed.)
  • Any project must begin by identifying the problem with learning and only then consider how to use ICT to solve it.
  • There is great potential benefit in utilising quantitative data that has already been collected – such as from governments or international bodies.
  • When considering impact on learning it is important to be prepared for unexpected outcomes – both within and beyond the project.
  • The focus must be on impact on learning – not input-based impact of technology.
  • It is important to use broad definitions of learning that include learning outside the classroom and student capacity development. This is dependent on developing appropriate indicators for 21st century skills.
  • Having a broader conception of how impact on learning is measured would benefit projects – most current measurement instruments do not facilitate this.

Breakout workshop 3 – Donors

Problems and Challenges Identified

  • There is an unequal balance of power within many projects – the current norm is for the donor to dictate the rules and then impose them on the recipient.
  • A conflict of interest often exists between donor and recipient in regard to project objectives, implementation and impact.
  • It is often not clarified by the donor how much of the budget is allocated for M&E.
  • Having a limited project lifespan is not always in the best interests of sustainability.
  • Working within an imposed project management framework can hamper project effectiveness.
  • Sustainability is not normally conceived in terms of human capacity development.
  • Creativity and innovation are often overlooked in favour of focusing on infrastructural issues.

Possible Solutions

  • There should be genuine partnership between donor and recipient, with equal ownership of the terms of reference and a template for mutual accountability.
  • Donors should take increased account of local conditions as they can have significant impact upon the required equipment specification.
  • All M&E exercises should have a clearly agreed and predetermined budget.
  • There should be clear agreement regarding power issues – i.e. ‘who decides what’.
  • There should be agreed understanding of when a project actually ends.
  • Demand-driven projects should be encouraged by donors.
  • Counterpart contributions (on the part of the recipient) should be valued and their monetary value should be quantified in some way.
  • There should be greater adherence by OECD donor countries to the Paris Declaration in their dealings with recipients.
  • The donors should be valued not just for their financial contribution but also for the valuable experience that they offer.
  • Projects can be more effective with the appointment of an independent M&E consultant.

General observations regarding effective M&E of ICT for Education

  • The use of ICT in education has brought many new challenges for teachers – not all of them are ICT experts and many simply want to get on with teaching. M&E must take account of this.
  • Many students have more advanced ICT ability than their teachers. Often teachers are not fully aware of what students are doing with ICT in their school.
  • M&E of ICT is a very broad topic – it is important to clarify what particular technologies we are talking about in each instance.
  • All interventions must be led by an educational focus – not a technology-based agenda.
  • The M&E exercise itself can be greatly assisted by ICT. Internet-based research makes respondents easier to track and reduces travel costs, and data can be collated and disseminated easily to stakeholders.
  • Who owns the M&E process and report – donor or recipient?
  • Who decides what are the key outcomes for measurement - donor or recipient?
  • The lack of money for M&E means that many reports are unfinished because the funds run out before completion.
  • Dissemination of M&E findings to all stakeholders is often neglected.
  • We must differentiate between internal and external evaluation.
  • It is difficult to ensure the objectivity of the measurements in M&E.
  • There is a tendency to publish M&E findings only for successful projects – those that fail are rarely written about or given attention.
  • We need to decide what is ‘good enough’ in the midst of financial constraints – M&E only constitutes a small percentage of the budget.
  • Data from M&E is not valid indefinitely – what is the ‘expiry date’ of your measurements?
  • Attention should be given to forward thinking in project planning – thinking more long term about M&E.
  • Sometimes there is no money within the budget available for M&E.