DRAFT 5, JUNE 20, 2005

Proposed Baseline Studies as part of BESPOR

Revised and Updated June 2005

As a model for WSD evolves and the development of GESDP gets underway, reliable gender-disaggregated baseline data on the current state of affairs is required. This is best done by utilizing and supporting existing systems, building local capacity wherever possible. Discussions with personnel in Planning/EMIS, SQAD and WAEC identified three strands of data collection and analysis to support the baseline. All three strands would be repeated in Year 3 or 4 to measure change. To these, BESPOR recommended adding a fourth strand: a longitudinal qualitative case study in selected schools in Region 5, where the WSD model is being piloted. The four strands are summarized in the table below.

Strand / Baseline Studies / Responsible
Strand 1 / A national assessment of EFA in 2005 using quantitative indicators such as enrolment rates by gender, repetition rates, dropout rates, attendance rates, physical facilities, etc., which can be directly derived from and monitored using EMIS data. A separate report on Region 5 would be done to support the whole school development pilot. / DoSE Planning/EMIS Director
Lead: Mr. Yunus Hydara with Mr. Alpha of EMIS
Strand 2 / A national assessment of pupils’ performance in Grades 3 and 5 across the four core subjects to support DoSE efforts to establish national education standards and assess how well the system is performing. The sample size would be 30%.
Additions to the baseline study would include administering the Grade 5 tests across the four core subjects to the pupils’ teachers in order to measure their level of content knowledge. To support this research, pertinent background data on teachers and pupils would be collected at the same time. / Test Development and Analysis: West African Examination Council (WAEC)
Test Administration in close collaboration with DoSE.
Lead: Mr. Abraham Joof of WAEC
Strand 3 / A pre- and post-survey of the conditions of teaching and learning in lower and upper basic schools in Region 5 to identify factors that affect student performance and the establishment of an effective education system. The sample size would be 30%, approximately 40 schools, of which 20 would be the schools selected to pilot the whole school development model under BESPOR. / Interim Coordinator: Malamin Sonko, M&E Consultant BESPOR, with SQAD and Gambia College support
Assistant Coordinators: Mr. Sang Gomez of SQAD with Gambia College: Sulayman Barry and Amadu Touray, and support of Nuha Jatta, Vice-Principal
Strand 4 / A longitudinal case study of 10 to 20 lower and upper basic schools in Region 5 to track qualitative indicators on school and teacher performance and document the process of change as the BESPOR whole school development model is implemented. / DoSE Directorate of Tertiary/Higher Education and Research with Gambia University
Lead: Mr. Musa Sowe, THERD
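The 30% samples called for in Strands 2 and 3 could be drawn as sketched below. This is a minimal illustration assuming a simple random draw over an invented school list; in practice the sample would likely be stratified by region and school type, and the 20 whole school development pilot schools would enter the Strand 3 sample by design, as the function's `must_include` parameter (a hypothetical name introduced here) allows.

```python
import math
import random

def draw_sample(schools, fraction=0.30, must_include=(), seed=42):
    """Randomly sample a fraction of schools, forcing named schools in.

    `must_include` covers cases like Strand 3, where the pilot schools
    for the whole school development model are in the sample by design.
    """
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    target = math.ceil(len(schools) * fraction)
    chosen = set(must_include)
    remaining = [s for s in schools if s not in chosen]
    extra = max(target - len(chosen), 0)
    chosen.update(rng.sample(remaining, extra))
    return sorted(chosen)

# Invented example: 130 schools, 3 pilot schools forced into the sample.
schools = [f"School-{i:03d}" for i in range(1, 131)]
pilots = ["School-001", "School-002", "School-003"]
sample = draw_sample(schools, 0.30, must_include=pilots)
print(len(sample))  # 39 schools, i.e. 30% of 130 rounded up
```

The fixed seed is a deliberate choice: a reproducible draw lets the same school list yield the same sample when the study is repeated in Year 3 or 4.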

To support and build the capacity of DoSE, Gambia College, WAEC, and Regional Education Office 5, it is recommended that local and international consultants be contracted for short-term inputs to facilitate the baseline studies. BESPOR M&E Consultants would facilitate the capacity building, monitor quality control, and ensure that baseline data and reports are produced in a timely and efficient manner.

Further details for each strand follow. The plan is to share the initial findings of Strands 1, 3 and 4 at a National Conference, tentatively scheduled for April 2006, though this date may change. Strand 2 would only report progress at that time, as data collection would not take place until the end of the school academic year in June or July 2006.


Strand 1:

EFA Assessment 2001-2005/6

The study will assess how well The Gambia has been doing in achieving its EFA goals since the last review in 2000. The output would be a National EFA Report examining education trends over the past five years, 2001 to 2005/6. It would feed into the EFA Reports and into the development of a sector-wide education programme. To support whole school development, a separate sub-sector report for Region 5 would be produced as part of the baseline for BESPOR. The reports could be updated annually or biennially as new data become available.

The assessment would utilize existing EMIS data from 2000 to 2004 and the school data on basic and secondary education collected in May 2005. Data entry and data verification are now underway and are expected to be completed by the end of June. Data from earlier years (2002/3 and 2003/4) is ready. Data for 2000/01 has gaps, but their extent is not known. Critical to the analysis is the capture of the Central Statistics Department’s 2003 population data, which will most likely be available by region in September 2005.
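To illustrate how EMIS enrolment counts and Central Statistics population figures combine into the gender-disaggregated indicators the assessment relies on, the sketch below computes a Gross Enrolment Rate (GER). All figures and the age band are invented for illustration; actual values would come from the EMIS database and the 2003 census projections.

```python
def gross_enrolment_rate(enrolled: int, school_age_population: int) -> float:
    """GER = total enrolment (any age) / official school-age population x 100."""
    return 100.0 * enrolled / school_age_population

# Hypothetical lower-basic figures for one region, by gender.
enrolment = {"girls": 8_400, "boys": 9_100}
population_7_12 = {"girls": 12_000, "boys": 11_500}

for gender in ("girls", "boys"):
    ger = gross_enrolment_rate(enrolment[gender], population_7_12[gender])
    print(f"{gender}: GER = {ger:.1f}%")
```

The same division underlies the NER, with the numerator restricted to pupils of official school age, which is why capturing the census population data is critical to the analysis.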

Data on ECD, Adult Literacy, Non-Formal Education, and Technical and Vocational education was not collected in 2005, but there are plans to do so in January 2006. At the same time, additional data beyond enrolment figures will need to be collected from Madrassas to provide a complete picture of EFA in the country.

Wherever possible, data analysis would be done by the EMIS staff to build their capacity. This will, however, require intensive training and continuing on-the-job support. A plan put forward by the Directorate of Planning is to contract the EMIS database developer, Ian Attfield, to come to The Gambia in September 2005 to support the training and data analysis required for the EFA Report and to troubleshoot existing EMIS problems (see attached TORs). To ensure a strong team is in place when the EMIS Consultant leaves, it is recommended that the new VSO Volunteer attached to EMIS join the training, along with the BESPOR ICT Consultant, Philip Bell. Both would then be in a better position to provide on-going capacity development as the need arises.

Timelines dictate that a draft National Report for basic and secondary education be ready by the end of November to feed into the 2005/06 planning and budget processes. This means that data entry, which is in progress, would need to be completed by August and validated. Data analysis would take place in September and October, with data interpretation and the writing up of findings in the first part of November. The EFA Secretariat has agreed to support the data interpretation and report writing at the national level. Much of the capacity building would take place in a 5-day workshop in November, when the writing teams would be trained in how to analyze the data and draft the report.

The EFA Report on basic and secondary education for Region 5 would be produced in conjunction with EMIS and may include an in-depth data analysis. Staff capacity would be built within the region to validate the data and interpret the findings as part of the capacity development process. This would be done through the workshop mentioned above. The approach would serve two purposes. It would build awareness among Regional officers, cluster monitors, head teachers, and other stakeholders of the potential uses of EMIS as a management tool. It would also build competence within the staff to handle statistics in order to produce reasonably accurate assessment reports for monitoring and planning purposes.

The draft report in November will be incomplete and will not be finalized until the missing data is collected in January 2006, analyzed and integrated into the 5-year report.

Details of tasks and timelines are attached. They were developed in a participatory process with Mr. Yunus Hydara of the Directorate of Planning and Ms. Mirlene Andre, the Peace Corps Volunteer who has just completed her two-year assignment with the EMIS Unit.

Issues:

  1. The Personnel Database has not been updated since 2002, so its data are not reliable. Reconciliation of the January 2005 teacher headcount with the EMIS Personnel Database could be done but will require the expertise of the EMIS Consultant.
  2. It will not be possible to report on all EFA performance indicators due to lack of data, e.g. the teacher:pupil ratio. Alternative means to calculate this will need to be identified.
  3. Financial indicators will prove difficult to obtain given the backlog in that department.
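On issues 1 and 2, one stop-gap, sketched below with invented figures, is to estimate a crude national pupil:teacher ratio from aggregates that are available: total enrolment from the 2005 EMIS collection divided by the January 2005 teacher headcount. The function name and numbers are illustrative assumptions; a school-level calculation would still require the Personnel Database reconciliation described in issue 1.

```python
def pupil_teacher_ratio(total_enrolment: int, teacher_headcount: int) -> float:
    """Crude national pupil:teacher ratio from aggregate counts.

    A stop-gap while the Personnel Database (last updated 2002) is
    reconciled with the January 2005 teacher headcount.
    """
    if teacher_headcount <= 0:
        raise ValueError("teacher headcount must be positive")
    return total_enrolment / teacher_headcount

# Invented figures purely for illustration.
ratio = pupil_teacher_ratio(total_enrolment=220_000, teacher_headcount=5_500)
print(f"Estimated pupil:teacher ratio = {ratio:.0f}:1")  # 40:1
```

A national aggregate of this kind masks regional and urban/rural disparities, so it should be labelled as an estimate in the report until school-level staffing data are reliable.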


Strand 1:

Tasks and Timelines for EFA Assessment Report

Strand 1: EFA Assessment Study 2001-2005/06
Lead: DoSE Directorate of Planning/EMIS
Responsible: Mr. Mohammed Jallow, Director and Mr. Hydara/Planning-EMIS
Objective: to prepare an assessment report on the status of EFA at the basic education and secondary education levels between 2001 and 2005.
Note: Data on ECD, Adult Literacy, Technical/Vocational and Madrassas was not collected for 2005 but will be collected in January 2006 for the next academic year.
Strand 1: 2005 EFA Assessment
Activities / Timelines / Responsible
1.0 Preparatory Steps
1.1 / Agreement with DoSE/Permanent Secretary and the Planning Directorate to move forward with EFA Assessment Report for 2001-2005. Note: Report would cover only basic and secondary education / June 2005 / BESPOR in consultation with Permanent Secretary and Director of Planning
1.2 / Identification of key EMIS staff to lead the EFA Assessment Report and staff to support the process including report writers. / By 30 June, 2005 / Directorate of Planning: Mr. Hydara and EFA Secretariat: Mr. Sanneh
1.3 / Develop TORs for the EMIS Staff to prepare for the EFA Assessment Report / By July 2005 / Directorate of Planning: Mr. Hydara
1.4 / TORs for International Consultant to support development of EMIS data analysis and training (14 days in-country including travel time) / By 30 June 2005 / TORs: Directorate Planning/EMIS with input from BESPOR/Nancy Yildiz
1.5 / Identify funding source for External Consultant / By 30 July 2005 / Directorate of Planning with BESPOR and others
1.6 / Prepare Contract for EMIS Consultant and sign / 15 August, 2005 / To be determined
1.7 / Budget for EFA Assessment prepared (include assessment in 2005 and again in 2008) / 30 June 2005 / Lead: Mr. Hydara of Planning in consultation with Mr. Sanneh of EFA Secretariat and Erling Petersen of BESPOR
1.8 / Budget Approved / By 30 July 2005 / DoSE/Permanent Secretary and BESPOR
2.0 EMIS Data Collection and Verification and Validation
2.1 / EMIS Data Collection in Lower Basic, Upper Basic and Secondary Schools for 2005 / Completed in May 2005 / Planning/EMIS with Head Teachers and Regional Education Directorates (REDs)
2.2 / Data Verification Across the Regions targeting 20% of the schools for academic year 2004/05 / End of June 2005 / Lead: Mr. Hydara, Planning Directorate/EMIS
2.3 / EMIS Data Entry / June –July 2005 / Coordinator: Paul K. Mendy of Planning
EMIS Staff at Central Level
Possibly contract additional persons to speed up data entry if funds can be found
2.4 / Validation of 2004-05 data for basic and secondary education / August 2005 / Lead: Paul Mendy of Planning with EMIS Staff
2.5 / Validate EMIS data for Upper Basic Education and Secondary Schools for 2003/4 / August 2005 / Lead: Paul Mendy of Planning with EMIS Staff
2.6 / Technical Consolidation of EMIS data (e.g. running estimates, etc.) / 31 August 2005 / Lead: Alpha
2.7 / Data is ready to produce indicators and tables / 31 of August 2005 / Mr. Hydara of Planning
3.0 2003 Population Data
3.1 / Contact Central Statistics Department and inform them of EFA Assessment Exercise with DoSE. Agree on a date to get data / As soon as possible
By 30th June 2005 / Lead: Mr. Hydara of Planning
3.2 / Get 2003 population data from Central Statistics and upload into EMIS / TBA but as quickly as possible / Lead: Alpha from Planning
3.3 / Data capturing and manipulation using Central Statistics Department data on 2003 Population Census / By end of September 2005 / Lead: Alpha with EMIS staff
4.0 EMIS Training
4.1 / Conduct an assessment of the 5 EMIS staff’s skills and summarize in table form for the External Consultant / By June 2005 / Lead: Mirlene and Alpha
4.2 / EMIS Consultant arrives for two-week input / September 2005
Dates to be confirmed / Lead: Planning Directorate in consultation with Consultant
4.3 / One-day workshop on Education Indicators covering what they are and their meanings for EMIS Staff / By 30th of September 2005 / Lead: Director of Planning with Mr. Hydara
4.4 / On-the-job training of EMIS Staff in EFA targets; develop dummy tables based on EMIS data from 2001-2005 for review (one-day session) / By 30th of September / Lead: Director of Planning with Mr. Hydara
4.5 / 5-day training workshop on data analysis for EMIS staff should include consolidating data annually and across years and producing Excel tables necessary for reporting. (Participants would be EMIS staff, VSO Volunteer, BESPOR Consultant-Philip Bell). / By 30th of September 2005
Dates to be determined with consultant / Organizer: Planning Directorate, Mr. Hydara
Facilitator: Mr. Ian Attfield, EMIS Consultant
Participants: EMIS Staff, VSO Volunteer, Philip Bell – BESPOR Consultant
4.6 / On-the-job-training on other aspects of EMIS; e.g. query design, linking tables, troubleshooting, data entry screens, consolidating data once it is entered / By 30th of September 2005
Dates to be determined / Organizer: Planning Directorate, Mr. Hydara
Facilitator: Mr. Ian Attfield or Alpha Bah
5.0 Data Analysis and Reporting–National Level
5.1 / Identify the kinds of performance indicators required for the EFA Assessment Report 2001-2005. Also identify indicators that cannot be processed / By 30th of October, 2005 / Mr. Mohammed Jallow, Director of Planning, with Mr. Hydara and inputs from Mr. Sanneh of the EFA Secretariat and M&E Consultants/BESPOR
5.2 / Assign staff different analysis tasks by sub-sector and/or the six targets / By 30th of October, 2005 / Lead: Mr. Hydara
5.3 / Analysis of EMIS Data in all six regions
(GER, NER, Retention Rates, Dropout Rates, Completion Rates, disaggregated by gender) / By 30th of October, 2005 / Coordination: DoSE/Planning Directorate
Data Analysis: EMIS staff and contracted consultant
Data Analysis and Report: Contracted national consultant
5.4 / All data analyzed and tables generated in preparation for report writing / By 30th of October, 2005 / Lead: Mr. Hydara of Planning
5.5 / 5-Day Workshop on Data Interpretation and Report Writing for Basic and Secondary Education:
Days 1 & 2: Training in data interpretation and reporting format
Days 3 & 4: Drafting report for each target
Day 5: Sharing of first draft and revisions recommended
Output: Draft 1 of the EFA Assessment Report for basic and secondary education / Nov. 9-13, 2005 / Organizer: Mr. Hydara of Planning
Writing Teams: There would be four teams covering the six EFA targets, with two members per team (30 participants)
Region 5 Team: 5 members invited, as they will do a separate analysis of CRD.
Other Participants: Some EMIS Staff who could do analysis on the spot, as well as BESPOR Consultants
5.6 / Prepare draft report to present at workshop / Nov. 16-21 / Lead: Mr. Hydara of Planning with direct support from Mr. Sanneh of the EFA Secretariat
5.7 / One-day Workshop to discuss findings with stakeholders and revise / Nov. 22 / Organizer: Directorate of Planning, Mr. Hydara
5.8 / Incorporate comments from Workshop into Draft EFA Assessment Report / Nov. 23-24 / Lead: Mr. Hydara of Planning with Mr. Sanneh of EFA Secretariat
6.0 Data Analysis and Reporting – Region 5 Sub-Report
6.1 / During workshop, do a preliminary analysis of Region 5 data and a draft report. / Nov. 9-13 / Lead: Region 5 Director with his team
6.2 / Identify further kinds of analysis required in Region 5 for BESPOR baseline data during the workshop / By Nov. 13 / Lead: BESPOR Consultants with RED Team
6.3 / Carry out the data analysis and generate required tables / By 30 November / Lead: EMIS Staff
6.4 / Interpret tables and prepare draft report / Dec. 1 -4 / Lead: Region 5 Director with team and input from BESPOR Consultants
6.5 / Compile into a final report / Dec. 5-10 / Lead: Region 5 Team
6.6 / One-day Workshop to Share Findings at the Regional Level / January 2006 / Lead: RED 5
6.7 / Use EFA Assessment Report to draw up Strategic Plan for Region 5 and individual School Development Plans in Pilot BESPOR Schools / January 2006 / Lead: RED 5
Note / Region 5 Assessment would be updated annually and if not possible then in Year 3 or 4 / Coordination: Director of Region 5
7.0 New Data Collection for 2005-06
7.1 / Organize collection of school data and missing data for 2005/06 academic year / Dec. 2005 / Directorate of Planning: Mr. Hydara
7.2 / Collect data for 2005/06 academic year / Jan 2006 / Directorate of Planning: Mr. Hydara
7.3 / Data Entry and cleaning of 2005/06 data / Feb-March 2006 / Coordination: DoSE Planning
7.4 / Data Analysis and Interpretation to include 2000 to 2006 / April 2006 / Coordination: DoSE Planning
7.5 / Integrate findings into draft EFA Assessment Report / April-May / Coordination: DoSE Planning
7.6 / Workshop to review findings and get comments / May / Coordination: DoSE Planning
7.7 / Finalize EFA Assessment Report / May / Coordination: DoSE Planning
8.0 Dissemination of Findings
8.1 / Share the findings of the Strands 1 and 3 and progress reports on Strands 2 and 4 at a National Conference / TBA / Coordination: DoSE Planning with SQAD

Prepared by Yunus Hydara of Planning, Mirlene Andre of EMIS and Nancy Yildiz, BESPOR

Strand 2:

National Assessment Tests for Grades 3 and 5

The scale for gauging the effectiveness of the education system in The Gambia rests on pupils’ performance on examinations. With the phase-out of the Grade 6 examination, a need was identified to develop learning achievement targets (LATs) as benchmarks for improving the quality of education. This was done for Grades 1 to 5 and revised in 2005 to align with the current syllabus and textbooks in use; how closely this alignment has been achieved requires investigation. The revised LATs are now being distributed to Regional Offices, which are responsible for familiarizing teachers with them. These LATs are to be used to guide continuous assessment in schools and the development of National Assessment Tests.

To date, reliable data on the quality of education has been very limited. The MLA study of Grade 4 learning achievements in 2000 is frequently cited as evidence that less than ten percent of the children tested had met the learning achievement targets set for the four core subjects (English Language, Mathematics, Social/Environmental Studies and Science). Test items were based on the LATs, with no baseline data on what was actually covered in the classroom. Informal evidence suggested that many pupils were tested on items they had not been taught, and this, along with the high level of difficulty of the items, contributed to the high failure rates. The study showed that pupils in private schools performed better than their counterparts in Mission and Government Schools. There were also wide disparities across regions and between urban and rural schools. Similar findings emerged in 2002, when WAEC carried out a study of learning achievements in Grades 3 and 5. Since then, no national assessment tests have been carried out.