This toolkit was originally developed by PC Burkina Faso. The original has been revised by OPATS for wide use and to include global health indicators. Please revise to align with your programming as needed.
Volunteer and Counterpart Handbook
Measuring Our Results
Table of Contents
Purpose of the Handbook
Developing a way to measure results
Step One: Assess what you are doing by topic or issue area
Step Two: Select the key results you and your counterparts agree to measure
Step Three: Decide how to measure your key results
Step Four: Develop tools to measure your key results
Step Five: Practice using the measurement tools
Step Six: Improve the tools based on how well they worked in practice
Step Seven: Pilot test the revised tools again
Step Eight: Discuss the results of the pilot test and improve tools some more
Step Nine: Develop an action plan to measure the key results of your work
Step Ten: Carry out your action plan
Coordinating and supervising a survey effort
Compiling and summarizing what you learned from your survey/measurement effort
Sharing findings with your community
Sharing findings with Peace Corps
Using your findings to improve your work
Using your findings to strengthen funding proposals
Appendices
Appendix A: Detailed facilitator’s agenda for the five-day ‘Measuring Success’ workshop
Appendix B: Glossary
Appendix C: Malaria survey
Appendix D: HIV/AIDS survey
Purpose
The purpose of this draft handbook is to assist Peace Corps Volunteers, their counterparts, and members of their communities in better measuring the results of their efforts. It is based on a five-day workshop conducted in April 2007 with Volunteers and counterparts active in community health promotion. As a result, this first draft of the handbook contains training session ideas as well as questionnaires and data collection tools designed and piloted by the participants of that workshop. Please continue to add to this handbook to make it more useful to Volunteers working throughout Burkina Faso.
Developing a way to measure results
Step One: Assess what you are doing by topic or issue area
Volunteers and their counterparts should work together in pairs to list their main health education or other activities in the past year.
· What types of health education or other activities have you undertaken in the past year?
· What was the focus of the activity? What messages did it convey?
· How many times did you do the activity?
· How many people did you reach through the activity?
To see the breadth of topic areas being covered by Volunteers and their counterparts (e.g., HIV/AIDS, malaria, hygiene), return to the large group and have each Volunteer/Counterpart pair list their activities by topic area. Take each topic area in turn, recording the activities on flip charts like the one below:
Topic: Malaria (Paludisme)
Activities (Activités): e.g., talk to mothers at the clinic about the importance of sleeping under nets
Step Two: Select the key results you and your counterparts agree to measure
As a group, discuss what outcomes you hope to achieve. Choose one topic area (e.g., HIV/AIDS) to work through together as a group discussion, asking the following questions:
· What type of change are we trying to encourage? (improved knowledge, attitude, behavior)
· Who is the target population? (pregnant women, youth, men, etc.)
Have two Volunteer/Counterpart pairs interview each other about the kinds of results they want from their work in health education or other areas. Participants then briefly present the results of their interviews to the group. Record desired results/outcomes on flip charts like the one below:
Topic: Malaria (Paludisme)
Activities (Activités): e.g., talk to mothers at the clinic about the importance of sleeping under nets
Desired Results (Résultats Désirés): e.g., more women and children will sleep under nets
The facilitator then takes a few of the desired results for one health topic or issue area and examines them carefully, asking the group these questions:
· What kind of outcome is this—a change in knowledge? Skills? Attitude? Behavior? Condition?
· Is this outcome a realistic result of the activity?
· Is this outcome something we all agree is very important?
· Is this outcome something that more than one of us is hoping to achieve?
· Should we call this outcome a key outcome?
· How will we know when this outcome has been achieved?
Step Three: Decide how to measure your key results
Look at each desired result and discuss how you would measure whether or not it is being accomplished. If it is a change in knowledge or awareness, you will need to ask people questions about their awareness and knowledge. If it is a change in behavior, discuss how you can collect this data: through direct observation, or through practices reported by the target population in response to your questions. Examples of observation include checking for the presence of soap where family members wash their hands, inspecting the latrine, or seeing how drinking water is stored in the household. Examples of reported behavior include asking individuals about the actions they took during an illness in the weeks preceding the survey, or about child feeding in the 24 hours or week prior to the survey. Health providers can also be interviewed, since they can speak to trends in care-seeking behavior and provide data showing use of health services.
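If you or your counterpart later enter observation results into a computer, the minimal sketch below (written in Python; the household data and field names are hypothetical examples, not part of any standard form) shows one way such observations could be tallied:

# Minimal sketch: tallying hypothetical household observation results.
observations = [
    {"household": 1, "soap_at_handwashing_spot": True,  "drinking_water_covered": True},
    {"household": 2, "soap_at_handwashing_spot": False, "drinking_water_covered": True},
    {"household": 3, "soap_at_handwashing_spot": True,  "drinking_water_covered": False},
]

total = len(observations)
with_soap = sum(1 for o in observations if o["soap_at_handwashing_spot"])
covered = sum(1 for o in observations if o["drinking_water_covered"])

print(f"Households observed: {total}")
print(f"Soap present where hands are washed: {with_soap} of {total} ({100 * with_soap / total:.0f}%)")
print(f"Drinking water stored covered: {covered} of {total} ({100 * covered / total:.0f}%)")

The same tallies can, of course, be done by hand on paper; the sketch simply illustrates how each observation translates into the counts and percentages you will report.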
If the desired change is in knowledge, discuss these questions to decide how you will go about measuring it:
· Who will you need to interview/ask?
· Where is the best place to do this?
· When is the best time to do this?
· How should you conduct these interviews and ask these questions?
· What question(s) will you ask them to assess their knowledge/awareness?
Step Four: Develop tools to measure your key results
In the large group, take one key outcome and address the following questions:
What will tell you that people have increased their knowledge or changed their attitudes or behavior? What questions should you ask them?
Topic: Malaria (Paludisme)
Key Result (Résultats Clefs): e.g., people will recognize the symptoms of malaria
What information do we need?
· The level of knowledge throughout the village about the symptoms of malaria: cyclical fever, headache, chills/convulsions
What question(s) should we ask?
· “How do you know when someone has malaria? What are the signs?”
Who do we ask?
Where do we ask them?
When do we ask them?
Then have participants work in small groups to begin drafting questions for assessing other changes in knowledge.
In drafting these questions, bear in mind that in many ways “less is more.” This means that you should only ask a question you are very sure will yield useful information. The goal should be to ask as few questions as you can to get the information you absolutely need. This will make the overall effort much more efficient for you and for the people you need to interview, and it will make managing all the information you get much easier.
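If a computer is available when you compile responses, the short Python sketch below illustrates how answers to the symptoms question could be tallied to estimate the share of respondents who can name at least one sign of malaria. The recorded answers and the list of accepted signs are hypothetical examples used only for illustration.

# Minimal sketch: estimating what share of respondents can name at least one
# sign of malaria. Answers and accepted signs are hypothetical examples.
accepted_signs = {"fever", "chills", "headache", "convulsions"}

# One entry per respondent, as recorded for the question
# "How do you know when someone has malaria? What are the signs?"
responses = [
    {"fever", "headache"},
    {"tiredness"},
    set(),                      # respondent could not name any sign
    {"chills", "fever"},
]

total = len(responses)
knows_a_sign = sum(1 for answer in responses if answer & accepted_signs)
print(f"{knows_a_sign} of {total} respondents named at least one accepted sign "
      f"({100 * knows_a_sign / total:.0f}%).")

A hand tally on paper gives the same result; the point is that each question you keep should map directly to a number or percentage you know how to calculate.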
Step Five: Practice using the measurement tools
Once you have a draft of the questionnaire/survey instrument, it will be useful to practice using it before you head out to interview people for real. To practice, ask two participants to model an interview: one plays the interviewer and the other the respondent. The respondent should pretend to be someone in a village, sitting at home, not expecting to be approached by someone asking for an interview. The interviewer should approach the villager, greet her appropriately, explain why he or she is there, and ask permission to do an interview.
The facilitator of the workshop should stop the role play at this time. Ask the large group to reflect on how well the interviewer approached the beginning of this interview:
Q: What did you think about how the interviewer introduced himself?
Q: Was the purpose of the interview clearly explained?
Q: Did the interviewer ask permission to do the interview?
Q: Did the interviewer set an appropriate tone for the interview/put the respondent at ease?
Q: What could the interviewer do better next time?
After this discussion, the interview continues, question by question. The facilitator should wait for each question to be asked and answered, and for the interviewer to record the response, then pause the role play and ask the group to reflect again.
Q: How clear was the question?
Q: How clear was the response?
Q: Could the interviewer have done anything to prompt a clearer or more complete response?
Q: Is this question in the right place? Should it be asked earlier or later in the interview?
Continue this process until all the questions have been asked and answered.
Step Six: Improve the tools based on how well they worked in practice
Make revisions to the questions based on the lessons from the role play exercise. The objective is to refine the questions so they are clear and in an appropriate order. If there is time, suggest that the group add any interviewer instructions to the questionnaire (for example, “If the answer is no, skip to question 5”).
Step Seven: Pilot test the revised tools again
In a large group setting, discuss the purpose of the pilot test, if you can do one.
What is the purpose of your pilot test? What are you testing, exactly?
Allow for some group discussion of this, and emphasize the following points about what the pilot test is testing:
· The clarity of the questions. Do you have to rephrase or restate the question too much? Is it clear the first time you ask it? How can you make it clearer?
· The usefulness of each question. Is it necessary to ask all the questions you have? Can you eliminate any questions?
· The length of the interview. How long does each interview take, on average? (A simple way to tally this is sketched after this list.)
· The availability of people to respond at that time. Are enough people available at the time of day/month/year for this survey?
· The clarity of responses to your questions. If more than one person is listening to the interview, do they each agree on how the respondent answered each question?
· The availability of other data/information. If some of the information you are looking for would come from the records of a health clinic or school, for example, is the specific information you would need available to you in a form you can easily use?
· How easy is it to accurately record responses on your data collection form(s)?
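If someone records the start and end time of each pilot interview, the average length can be checked quickly. The Python sketch below uses hypothetical durations in minutes; a hand calculation works just as well.

# Minimal sketch: summarizing the length of the pilot interviews.
# The durations below are hypothetical examples, in minutes.
interview_minutes = [18, 25, 22, 31, 20]

count = len(interview_minutes)
average = sum(interview_minutes) / count
print(f"Interviews completed: {count}")
print(f"Average length: {average:.1f} minutes")
print(f"Longest interview: {max(interview_minutes)} minutes")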
Small group planning of pilot test
Spend some time discussing these questions in your group and coming up with a plan for the pilot test tomorrow.
· How many interviews are you going to attempt to do?
· Where are you going to go when you get there?
· Who is going to do the interviews?
· If translation is necessary, who is going to do the translating?
Materials prep for pilot test
Ask at least one volunteer from each group to do the administrative work of typing up and copying final drafts of the survey/questionnaires, as well as any additional data collection tools, for their small group members to use tomorrow. The rest of the group can either break early for the day or meet on other issues.
Implementing the pilot test
Small groups go out to the village, each with a facilitator, to test the tools. Each group should try to conduct 5 to 10 interviews. Observers give feedback after each interview. Each group should also look for data as needed at the health clinic, in boutiques, and through observation.
Groups should use the following questions as a guide to evaluate their experience during the field test:
· How did you determine which way to go in the village?
· How many people/households did you interview?
· How did you choose the interviewees?
· How did the introductory part go? (did you put the interviewees at ease? did you ask for their informed consent? did you explain confidentiality? etc.)
· Were the questions understood by the interviewees? If not, why not?
· Is the order of the questions logical? If not, why not?
· Is the length of the survey appropriate? If not, why not?
· Can you eliminate any questions?
· Do you need to add any questions?
· Is the data recording instrument appropriate? If not, why not?
· Did you each record the same responses to each question? If not, why is there disagreement or confusion about how people responded to your questions?
· Did you find relevant data at the health center? If not, what did you do?
· Did you encounter any other difficulties?
· What are your recommendations for improving the process?
Step Eight: Discuss the results of the pilot test and improve tools some more
Small groups meet together first to review how their pilot test went and to prepare a summary for a debrief to the larger group. In your debrief, focus your remarks on:
· How many interviews you did.
· The average length of each interview
· Which questions seemed to cause confusion and how you recommend changing those