Final evaluation of the UNECE-led UNDA project on the implementation of Agricultural Quality Standards
TERMINAL EVALUATION REPORT
Sections                                                        Paragraphs
I.   Introduction                                                    1-8
     Purpose, mandate and scope of evaluation.
II.  Methodology of the evaluation                                   9-13
     Evaluative standards and norms, data collection and analysis,
     benchmarks, limitations and challenges.
III. Overview of the survey returns                                  14-27
IV.  Findings                                                        28-57
     IV.A. Relevance
     IV.B. Effectiveness
     IV.C. Efficiency
     IV.D. Impact
     IV.E. Sustainability
V.   Challenges                                                      58-62
VI.  Conclusions                                                     63-66
VII. Recommendations                                                 67-72
ANNEXES:
Annex 1: Summary of 34 responses to the Russian-language e-survey of participants in workshops in Moldova, Anapa (Russian Federation), Kyrgyzstan and Tajikistan
Annex 2: Summary of 55 responses to the online survey of participants of workshops in Nairobi, Kenya (April 2009), Cape Town, South Africa (April 2010), Santiago, Chile (February 2011), Accra, Ghana (September 2011), Dubrovnik, Croatia (October 2011) and Chiang Mai, Thailand (November 2011)
Annex 3: Summary of 19 responses to the online survey of participants in workshops in Indonesia (October 2010) and Egypt (March 2009)
Drafted by Vladislav Guerassev, 16 January 2012
I. Introduction
1. The purpose of this evaluation stems from the requirement stipulated on pages 15-18 of the UNECE “Guide for project managers (November 2010)”[1], which underscores that evaluation is a critical management tool and that a final evaluation provides an opportunity to validate the logic of a project, the synergy of its activities and the effectiveness of their implementation, to learn lessons for the future and to make adjustments as needed. The “Guide” also identifies the evaluation criteria and relevant evaluation questions that informed and steered this exercise.
2. This purpose is also underscored in the “Guidelines for Joint Development Account Projects”[2] as follows: “Evaluation is becoming increasingly important at the UN Secretariat. Although it has not been made mandatory, the practice has been established for DA projects to be evaluated by an external evaluator upon their completion.”
3. The mandate of the evaluation is provided in the “Terms of Reference” approved in conjunction with the UNECE consultancy contract No. 34796 of 15 November 2011.
4. The scope of the evaluation covers the UNECE Development Account project “08/09B: Enhancing capacity of developing countries to implement international standards for commercial agricultural products in order to improve their trade competitiveness”[3], with total funding of US$703,000. It is one of the 32 projects of tranche 6 of the UN Development Account, "Supporting the implementation of internationally agreed goals through innovation, networking and knowledge management".[4] The total funding for the 32 projects of tranche 6 amounts to US$18,651,300[5]; thus the project under evaluation (hereafter referred to as the Project) represented, financially, 3.8 per cent of tranche 6 and was about 20 per cent larger than the average project of this tranche.
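As a simple arithmetic check (the evaluator's own calculation, using only the totals quoted in this paragraph), the stated shares can be reproduced as follows:

\[
\frac{703{,}000}{18{,}651{,}300} \approx 0.038 \;(3.8 \text{ per cent}), \qquad
\frac{703{,}000}{18{,}651{,}300/32} \approx \frac{703{,}000}{582{,}853} \approx 1.21,
\]

i.e. the Project was roughly one fifth larger than the average tranche 6 project.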
5. The background to the Project[6] is that most developing and transition economies lack the resources to meet internationally agreed commercial agricultural quality standards, which puts them at a competitive disadvantage in the relevant export markets because compliance with those standards is a condition for access to developed-economy markets. The application of internationally agreed standards by developing countries, and especially by least developed countries, would facilitate the export of their agricultural products, stimulate the development of the agricultural sector and raise average incomes, especially among rural populations.
6. At the time of its inception, it was envisaged that this ECE-led Project would draw upon the long-standing expertise of ECE in the development and practical use of agricultural quality standards in the region and worldwide. By 2007, 36 UNECE standards for fresh fruit and vegetables had been incorporated into European Union legislation and had become obligatory at all stages of distribution (export, import, wholesale and retail trade). Since 2010, European Union legislation has provided the legal basis for member countries to use all UNECE standards for fresh fruit and vegetables, and ten of the most traded products in the EU follow standards that are fully harmonized with UNECE standards. The Project would build capacity in poor and low-income economies for the implementation of international commercial quality standards and sanitary and phytosanitary measures for agricultural products. It was also envisaged that the Project would draw upon the expertise of ECA, ECLAC, ESCAP and ESCWA on agricultural trade in their regions, and would take advantage of the expertise and technical assistance of UNCTAD and of collaboration with other international and national agencies, including the FAO/WHO Codex Alimentarius and UNIDO.
7. It was also envisaged that networks of national and regional counterparts would be established under the Project to work together with public and private stakeholders (experts, producers, exporters, traders, processors, etc.) from countries with advanced agricultural export/import sectors, including low- and middle-income countries. Those stakeholders would participate in country-specific assessments, to be followed by regional workshops to share lessons learned amongst all countries in the region and to formulate recommendations. The workshops would also serve as capacity-building activities.
8. The breakdown of the Project’s budget is as follows[7]:

Object                        US$        Per cent of the total
GTA/Experts/Consultants       179,500    25.5
Travel of staff                69,000     9.8
Contractual services           32,000     4.7
Operating expenses              4,500     0.6
Grants                        418,000    59.4
Total                         703,000    100.0
By the end of 2011, US$666,339 had been expended, for an implementation rate of 95 per cent. It was noted that the allotment for the implementation of the Project was transferred from Headquarters to UNECE with a six-month delay.
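The implementation rate follows directly from the two figures quoted above (evaluator's own check):

\[
\frac{666{,}339}{703{,}000} \approx 0.948, \quad \text{i.e. approximately 95 per cent.}
\]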
II. Methodology of the evaluation
9. The evaluation followed the standards provided in the normative documents regulating the conduct of evaluations in the UN Secretariat, namely: “Managing for results: a guide to using evaluation in the United Nations Secretariat”;[8] the OIOS “Inspection and Evaluation Manual”[9]; and UNEG’s “Norms for Evaluation in the UN System”[10] and “Standards for Evaluation in the UN System”[11]. The evaluator was also guided by the “Guide for project managers (November 2010)”[12], which assists UNECE staff in the preparation, implementation, monitoring and evaluation of technical cooperation projects.
10. In order to establish a sound database for the evaluation, three data collection methods were used: (a) a desk review of all pertinent documentation, both at the core and on the periphery of the Project; (b) beneficiary surveys administered to the complete universe of participants of the Project’s workshops; and (c) interviews and discussions with the Project’s management. All these instruments and data comprise the records of this evaluation, with clear audit trails.
11. The initial desk review of all the publications and records pertaining to the Project was aimed at accumulating knowledge specific to the Project that would allow the evaluator to form an initial impression of its comparative advantages vis-à-vis the efforts of other development actors within the same or similar thematic or geographic areas. It also served to formulate informed and specific questions for the questionnaires and interviews. The evaluation used a uniform design (with minor customizations) for the email questionnaire and the two e-surveys addressed to the participants of the 12 workshops delivered by the Project. They were sent to all participants of these workshops who had registered e-mail addresses. The survey returns are discussed below.
12. The evaluation used the approved logical framework of the Project[13] as both benchmark and yardstick in assessing the Project’s performance.
13. The limitations and challenges of this evaluation were largely dictated by the financial constraints and time limits of the exercise, which did not allow for on-site interviews with actual and potential stakeholders and beneficiaries. This evaluation is therefore not a comprehensive assessment of all aspects of the outcomes and impact of the outputs produced by the Project, or of the effectiveness of all its procedures and processes: performance evidence could be adduced on some, but not all, of its outputs and activities. The evaluation therefore focuses primarily on those elements considered, in the opinion of the evaluator, of greatest relevance for the sound implementation of the Project. These limitations were nevertheless largely mitigated by the wealth of quantitative and qualitative information obtained through the surveys administered during the evaluation, and by the use of common triangulation and extrapolation techniques to eliminate possible intrinsic biases and to ensure reasonable validity and credibility of the evaluative analysis and findings.
III. Overview of the survey returns
14. During the period 5-20 December 2011, electronic questionnaires were disseminated to 536 participants of the 12 workshops delivered by the Project. The questionnaires were grouped into three surveys. The first was administered in Russian, through direct e-mail, to 85 participants of four workshops conducted in former republics of the Soviet Union, namely in Osh, Kyrgyzstan (July 2009), Anapa, Russia (October 2010), Khudzhand, Tajikistan (September 2011) and Chisinau, Moldova (October 2011). The other two surveys were administered online. The second questionnaire went to 360 participants of six workshops in Nairobi, Kenya (April 2009), Cape Town, South Africa (April 2010), Santiago, Chile (February 2011), Accra, Ghana (September 2011), Dubrovnik, Croatia (October 2011) and Chiang Mai, Thailand (November 2011). The third questionnaire was sent to 91 participants of two workshops in Egypt (March 2009) and Indonesia (October 2010). The originals of all three questionnaires are on file at UNECE.
15. By the deadline of 23 December 2011, a total of 108 replies to the three questionnaires had been received[14]: 34 to the first questionnaire, 55 to the second and 19 to the third. The overall response rate was 20.1%, varying from 40% for the first survey to 15.3% for the second and 21% for the third. The higher response rate for the first survey is explained by the fact that its delivery was personalized and followed up by similarly personalized reminders. As this is a labour-intensive and time-consuming method of questionnaire delivery, it was not used for the other two questionnaires, which were administered more cost-effectively online. The overall response rate of 20.1% is considered quite satisfactory in terms of the statistical reliability and accuracy with which it represents the overall universe of workshop participants.[15] The consolidated returns of the three surveys are presented in the relevant annexes to this report.
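These response rates follow from the numbers of replies received and of questionnaires dispatched, as reported in paragraphs 14 and 15 (evaluator's own check, with rounding):

\[
\frac{34}{85} = 40\%, \qquad \frac{55}{360} \approx 15.3\%, \qquad \frac{19}{91} \approx 21\%, \qquad \frac{108}{536} \approx 20.1\%.
\]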
16. The feedback from participants was clearly marked by their interest and involvement: apart from providing ratings in response to the various questions, they also offered 184 entirely optional written comments, suggestions and opinions.
17. Of the 108 respondents, 57 (53%) held positions in Government, 17 (16%) worked in research, 27 (25%) belonged to the private sector and 8 (6%) held other occupations, such as consulting. Concerning their positions, 46 (43%) were managers, 52 (48%) were technical experts, 5 (4.5%) were traders and another 5 (4.5%) were growers. By the nature of their activities, 30.4% of respondents were involved in developing standards and setting up the legal and technical infrastructure for their adoption and implementation, 26.4% dealt with inspecting products for conformity with quality requirements, another 22.3% were engaged in promoting the practical application of standards by producers and traders, and the remaining 20.9% were training inspectors, producers and traders. Overall, the respondents appear to represent fairly well the target audience of the UNECE training delivered through the Project workshops.
18. Concerning the outcome of the workshops[16]:
- 85.2% of the respondents fully agreed that the workshops helped them better understand the importance of international standards for growers and traders, while 13.9% of them only somewhat agreed with this view and 0.9% had no opinion on this point;
- 70.7% of the respondents fully agreed that during the workshops they learned about the legal and technical basis necessary for implementing standards, while 23.6% of them agreed somewhat with this view, 3.8% had no opinion on this point and 1.9% disagreed with such an assessment;
- 76.1% of the respondents fully agreed that the workshops resulted in them better understanding how to work with standards in practice, while 20.3% of them agreed somewhat with this view, 1.8% had no opinion on this point and 1.8% disagreed with such an assessment;
- 72.1% of the respondents fully agreed that the workshops helped them to establish professional contacts in the region and beyond and 21.7% of them agreed somewhat with this view, whereas 4.3% had no opinion on this point and 1.9% disagreed with such an assessment;
- Replies were somewhat less upbeat as to whether the workshops allowed the participants to contribute to the development of international standards and explanatory material: 46.3% of them fully agreed that they did and 31.5% agreed somewhat with this assessment, whereas 15.7% neither agreed nor disagreed with this view and 6.5% disagreed with it;
- 47.2% of the respondents found their involvement in the discussion of UNECE standards and explanatory brochures effective and useful and 14.6% partially agreed with them, whereas 38.2% neither agreed nor disagreed with this positive assessment;
- 69.7% of the respondents were of the view that work on the content of standards and brochures should be part of future workshops and 25.8% partially supported this view, whereas 4.5% disagreed with it; and
- 54.6% of the respondents believed that the conclusions and recommendations of the workshops provided them with good guidance for their work on implementing standards and a further 41.7% of them somewhat agreed with this view, whereas 2.8% neither agreed nor disagreed and 0.9% disagreed with this view.
19. Experience showed that the language of the last two questions in this part was somewhat ambiguous, and some respondents were confused by the fact that a positive response to the question implied a negative assessment of the subject. This was clearly confirmed when individual replies to these two questions were juxtaposed against replies to the previous one. Due to this flaw, for which the evaluator bears responsibility, there was no point in discussing the returns on the last two questions in table 4 of the questionnaires.
20. Overall, participants gave overwhelmingly positive assessments of the outcome of the workshops, as evidenced by the combined share of the first two replies (fully agree and somewhat agree) regarding the results of the workshops: 99.1% of them felt that they better understood the importance of international standards, 94.3% learned about the legal and technical basis for implementing standards, 96.4% gained a better understanding of how to work with standards in practice, 93.8% established professional contacts in the course of the workshops, and for 96.3% of the participants the conclusions and recommendations of the workshops provided good guidance in their work on implementing standards. Concerning the work on the content of standards and brochures, 95.5% of the respondents felt that it should be on the agenda of future workshops.
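Each aggregate figure above is simply the sum of the “fully agree” and “somewhat agree” shares reported in paragraph 18; for example:

\[
85.2\% + 13.9\% = 99.1\%, \qquad 70.7\% + 23.6\% = 94.3\%, \qquad 69.7\% + 25.8\% = 95.5\%.
\]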
21. The above positive assessments are further supported by the respondents’ views on how well the workshops accomplished their main objectives: 55.1% gave an “excellent” rating and 40.2% a “good” rating with regard to the objective of “sharing knowledge on problems and best practices in using quality standards in various countries and regions” (3.8% gave “satisfactory” and 0.9% “poor” ratings). Similarly, 39.0% of respondents gave an “excellent” rating and 43.8% a “good” rating with regard to the workshops’ achievement of the objective of improving their capabilities to implement quality standards in their countries (16.2% gave “satisfactory” and 1.0% a “poor” rating).
22. Only 61.8% of respondents fully or partially believed that their involvement in the discussion of UNECE standards and explanatory brochures was effective and useful, whereas 38.2% neither agreed nor disagreed with this positive assessment. This could be attributed to the fact that only two workshops (those in Ghana and Thailand) explicitly included in their programmes topics directly related to work on the texts of standards and brochures. While this assessment is still positive, it indicates that there is some room for improvement in the interactivity and participatory aspects of the workshops.
23. The respondents were unambiguously positive about the balance between theoretical and practical (training) sessions and the time dedicated to theoretical sessions: 94.3% of them were of such a view, 96.5% found the technical visits useful and instructive, and 17.5% felt that more time should have been devoted to the practical sessions.
24. More than 55 written comments were provided in response to the question “What was the most valuable information/know-how you took away from this workshop?” (for the first questionnaire, responses were given in Russian and are translated into English in Annex 1, see blue font). Among the most notable of these comments, which illustrate the highly beneficial capacity-building outcome of the workshops, were the following:
- “The information on the structure and activities of the quality inspectorates in the EU countries and CIS countries; Practical approaches to quality assessments – I obtained comprehensive information about the status of explanatory brochures for the UNECE standards and appreciated the importance of their application in conjunction with the standards”;
- “The prospects for the creation of the unitary legal and normative base in the agricultural area were explained. Based on this, we chose to adhere to UNECE standards in developing internal regulations. This will promote the common understanding of quality controls between producers and consumers”;
- “Training on the content of the standards and on their application are being used by me in the development of the national standards such as the one on nuts and its subclasses”;
- “I was much appreciative of the practical approaches to the assessment of the quality of products by experts and became convinced once again in the importance and utility of publishing the explanatory brochures for the UNECE standards and their application in conjunction with the standards”;
- “The information that every standard is accompanied by an explanatory brochure was a real discovery for me; the workshop helped to build capacity in the development of illustrated brochures”;
- “Since we are exporters to EC countries, the categories of quality for sorting out the nuts and dried prunes [presented at the seminar] were important for our business”;
- “The practical aspects of implementing trade standards for export markets and the necessity of creating the adequate legal and technical infrastructure for quality controls on compliance with the UNECE standards were very enlightening”;
- “The most valuable information taken away from this workshop was how the application of marketing quality standards tremendously improved the Kenyan fruit and vegetable industry leading to increased volumes of exports as well as increased incomes and reduction in poverty”;
- “Knowledge on the classification of the products was crucial and the information on how inspectors inspect the products and how they classify it. I transferred this knowledge to agriculture cooperatives in Bosnia and Herzegovina that export products to other countries”;
- “The most valuable information we took away from this workshop is the appropriate management of seed potatoes certification based on the internationally recognized standard for ensuring high quality of seed. Also we have been informed precisely about the UNECE Standard for Seed Potatoes especially as potatoes have become a major and strategic crop in countries. Likewise the discussion conducted on the main potato pests and diseases was so effective. The knowledge learned from the UNECE Standard for seed potatoes could be useful to help us in the Lebanon in order to reinforce legislation concerning the production of certified potato seeds”; and
- “Most valuable information was tolerances of other countries. I could not obtain these data from scientific reports. Now I can compare our system with that of other countries”.
25. Concerning the Project’s website, it was rated “excellent” by 41.7% of respondents and “good” by 50.9% of them with regard to its ease of access to standards, meeting documents and workshop materials (6.5% gave “satisfactory” and 0.9% gave “poor” ratings). With regard to the usefulness of its content, the website was rated “excellent” by 40.7% of respondents and “good” by 50.9% of them (8.4% gave a “satisfactory” rating). Among the most notable written comments about the website were the suggestions to: