Common Framework for a Literacy Survey Project
Literacy Survey
A Dissemination Guide
May 2014
Table of Contents
Preface
1. Introduction
2. Development of a National Dissemination and Communication Strategy
3. On Users and Uses
4. Planning the Outline of the Report on the Survey Results
5. The frameworks that are developed to guide the design of the assessment
6. The item pools that are used to assess proficiency and to aid in interpreting the meaning of results
7. Estimation Procedures
8. Errors in Estimation
9. Ancillary products and services
Preface
The Dissemination Guide builds upon the project “Common Framework for a Literacy Survey”, which was executed by the Caribbean Community (CARICOM) Secretariat under funding provided by the Inter-American Development Bank (IDB) Regional Public Goods Facility. The main aim of the project was to design a common approach to the measurement of literacy in countries. This common framework is built upon international methodologies, fundamentally the International Survey of Reading Skills (ISRS), that enable more reliable measurement of literacy than presently exists in the Region.
The literacy assessment is designed to measure functional literacy. In other words, it will determine an individual’s literacy level by employing a series of questions designed to demonstrate the use of their literacy skills. This involves two steps – the objective testing of an adult’s skill level and the application of a proficiency standard that defines the level of mastery achieved. The assessment measures the proficiency of respondents on three continuous literacy scales – prose, document and numeracy. In addition, it will collect information on reading component skills. Component skills are thought to be the building blocks upon which the emergence of reading fluency is based. Information on the reading component skills will be collected only from people at the lower end of the literacy scale. The testing phase is preceded by a selection phase, which includes the administration of a Background or Household questionnaire; once the respondent has been selected from the household, an initial pre-assessment is undertaken using a filter test booklet to determine what type of assessment should be administered in the testing phase.
A consultant, Mr. Scott Murray of Canada was hired to undertake the provision of services on this project. The CARICOM Secretariat (including Regional Statistics and Human and Social Development Directorate) and the CARICOM Advisory Group on Statistics (AGS) were instrumental in the execution of the project throughout all phases. In addition there was participation by Member States and some Associate Members relative to the technical rollout of the instruments and documents.
<The paragraph that follows can be country-specific>
This Dissemination Guide is aimed at providing a country undertaking a Literacy Survey with guidelines that can enable the effective provision of information about the survey to the population, along the lines recommended under the IDB-funded CARICOM project.
1. Introduction
National Literacy Assessment Systems are, by definition, expensive, technically and operationally demanding and politically sensitive. Well-designed and executed Literacy Assessments can lead to dramatic and lasting improvements in the quality of education; specifically, they can improve the quality and equity of educational and labour market outcomes for adults.
One of the characteristics of a well designed and implemented skill assessment is that it makes a conscious effort to maximize the impact that the assessment has on policy, individual and institutional choices.
At the institutional level, the overall goal is to ensure that assessment programmes are funded over the long term, are viewed as worthwhile by key stakeholders and enhance the work being undertaken at the policy level.
This overall goal should redound to the benefit of the public, who expect a return on the investment of public funds through the decisions of Government policy decision-makers, educational administrators, teachers and students/adult learners. Finally, the logic applies to the personnel, such as those of the National Statistical Office and technical counterparts in the relevant ministries, who are charged with the development and implementation of literacy measurement projects, since they are critical to the undertaking of such an onerous assessment.
A survey is not complete until the information collected is made available to potential users in a form suitable to their needs.
The purpose of this Dissemination Guide is to set out the approach to developing dissemination and communication strategies for a Literacy Survey. The guidelines start with this Introduction as Section 1; Section 2 covers the Development of a National Dissemination and Communication Strategy; Section 3 the Users and Uses of the survey results; Section 4 the Planning of the Outline of the Report of the Survey Results, including the production of an Administrative Report; Section 5 presents guidelines on the provision of information on the actual framework of the assessment; Section 6 focuses on the Process of Formulating the Test Questions, as linked to the interpretation of results; Sections 7 and 8 provide guidelines on the presentation of information to the population on Estimation Procedures and on Errors in Estimation respectively. The Dissemination Guide ends with information on Ancillary Products and Services to reach the stakeholders/target audience in Section 9.
<The structure above can also be country-specific>
2. Development of a National Dissemination and Communication Strategy
There are various tasks that are to be undertaken when developing a national dissemination and communication plan. These are as follows:
i. Prepare a detailed list of users of national assessment data
National Implementation teams need to list all of the potential users of their national assessment data.
ii. Produce a Matrix of Users, Uses and Type of Application
Having already identified the users of their data, national study teams must identify the level of technical competence of each user and the uses to which they believe the data will be put. To do this they need to ask themselves the following questions:
o Who are my users?
o What does their level of technical competence imply for the nature of my products and services?
o To what use(s) do they wish to put the assessment results?
o What do the intended uses imply for the nature of my products and services?
o Are there uses that the assessment’s technical characteristics will not support?
Key users should be ranked by importance, by their ability to absorb technical information and by whether they are hostile, neutral or supportive users.
The uses to which each user will want to put the data should then be identified. For each use, national study teams should specify what the use implies in terms of the technical characteristics of the assessment data or the actual information that will be required. Uses should be ranked on a scale of high, medium and low for each user.
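As an illustration, the matrix of users and uses described above can be kept as a simple structured table. The sketch below (in Python, with hypothetical user names, uses and rankings chosen purely for illustration) records each user's technical competence and stance, ranks each use as high, medium or low, and extracts the high-priority uses that a national report must serve first.

```python
# Illustrative sketch of a Users/Uses matrix (all entries hypothetical).
# Each user is recorded with their technical competence, their stance
# (hostile / neutral / supportive) and a high/medium/low rank per use.

users_uses_matrix = {
    "Education ministry": {
        "competence": "high",       # access to statistical expertise
        "stance": "supportive",
        "uses": {
            "set policy priorities": "high",
            "allocate funds": "high",
            "monitor trends": "medium",
        },
    },
    "Community leaders": {
        "competence": "low",
        "stance": "neutral",
        "uses": {
            "judge local school performance": "high",
            "monitor trends": "low",
        },
    },
}


def high_priority_uses(matrix):
    """Return (user, use) pairs ranked 'high' -- the uses the
    national report should be designed to serve first."""
    return [
        (user, use)
        for user, profile in matrix.items()
        for use, rank in profile["uses"].items()
        if rank == "high"
    ]


for user, use in high_priority_uses(users_uses_matrix):
    print(f"{user}: {use}")
```

A table of this kind also makes it straightforward to flag, per user, any uses that the assessment's technical characteristics will not support.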
iii. Draft a National Report Outline
National implementation teams must:
o Lay out the basic story line for their national report.
o Draft a statement of the detailed objectives to be served by their national reports and adjust the detailed report outline produced earlier so that it responds to these objectives.
o Identify their three most likely critics, the probable nature of their criticism and how the national report will attempt to deal with these criticisms.
iv. Prepare list of Primary Products and Services to be produced
National implementation teams must produce a list of products and services that they plan to produce. For each product they must identify which uses and users the product or service is meant to respond to.
v. Draft a detailed production schedule, resource plan and budget
National implementation teams must draft a detailed production schedule and associated cost estimates for their national report and, where possible, identify the individuals who will be responsible for each activity.
National study teams must pay particular attention to identifying:
o The assumptions they have made about the time and costs associated with key activities
o Any gaps they have in expertise and how they intend to fill them
vi. Identify the Ancillary Products
National implementation teams must list and describe the key attributes of the ancillary products that they will produce and identify associated budgets.
3. On Users and Uses
The fitness of any statistical data may be judged mainly in terms of two criteria:
o The use to which the data will be put, and,
o The ability of the users to understand the theory and methods that were used to collect the data and how these influence the interpretation of the results.
Users vary greatly in their interest and ability to understand and apply statistical information in their decision-making.
3.1 Users and their technical competence
The potential users of adult skill assessment data are numerous and extraordinarily diverse in their ability to draw inferences from complex statistical data. In many cases, the same user has a need for a range of products and services to meet a variety of uses of differing technical content. Key user groups are:
Individual learners need information to reflect upon their own progress and learning needs. They generally have very limited statistical knowledge but have strong incentives to understand.
Teachers and instructors need information to judge the progress of individual students and to adjust instruction accordingly. In the absence of specialized training, teachers and instructors generally have limited skills in using statistical information. Additionally, many of them do not have strong incentives to seek out and apply statistical information in a systematic way.
Citizens need information to judge whether the education and adults education systems are meeting their social, educational, health and economic goals and are doing so in an efficient and effective way. Most citizens have limited statistical knowledge and little interest in detail and nuance – they want and need a set of stylized facts about the performance of the systems.
Educational administrators, at several levels need information for multiple purposes:
Department heads and school principals need information to reflect upon the performance of teachers in particular domains and on the performance of specific groups of students, to adjust teaching priorities and curricula, to formulate targeted in service training for teachers, to design compensatory programs and supports, to demonstrate performance to parents and administrators higher up in the system and to lobby for additional resources. Department heads and school principals are generally comfortable with statistical data but have little time to undertake primary analysis themselves.
School board officials and their subject matter and diagnostic specialists need information for the same reasons but also to reflect on the relative performance of schools and to take action to improve same. As a group, they have mixed statistical skills – the specialists generally have advanced analytical skills and the interest and ability to use statistical information.
Educational administrators at the regional, sub-regional and national levels, including specialists in particular assessment domains and those responsible for accountability measures and reporting, need information for the same reasons described above. As a group, they have access to statistical expertise and the resources to apply them but have a need for stylized facts about the performance of their part of the system. Their key clients are politicians, including the minister(s) responsible for education and learning, teachers unions and citizens.
Unions representing the collective interests of workers, teachers, principals and school boards need information to protect their members’ interests. They can be a powerful agent for and against change and have access to statistical expertise, but have strong incentives to use information to support their positions.
Community leaders, including local politicians, need information to assess whether the school system is producing what the community needs to meet its social, cultural and economic goals. Most community leaders have very limited quantitative skills but can usually access what they need in the community. Community leaders also need the assessment data to judge the need for, costs of and benefits of higher levels of investment in adult skill upgrading.
Education faculties responsible for the training of new teachers and adult instructors require information on the performance of current approaches to teacher training, curricula and instruction. As a rule the staff of education faculties have access to statistical expertise required to use assessment data.
Non-governmental agencies, research institutes and social advocates need information to monitor trends in educational outcomes and to argue for structural and policy changes. As a group these agencies have mixed ability to deal with statistical data.
Politicians and policy makers in a variety of national ministries need information for several purposes:
All ministers and ministries need information to understand the ability of their clients to use print and to adjust their communication strategies and channels accordingly;
Education ministers and educational policy makers need objective, comparative measures of skill quantity, quality, equity and salience to judge system performance, to set policy and program priorities and to allocate funds;
Labour ministers and their policy makers need information to understand the quality and quantity of skill flowing from the education system with a view to identifying skill shortages;
Culture ministers and their policy makers need information to understand the relative position of linguistic and cultural minorities either in the official language(s) or minority languages;
Health ministers and their policy makers need information to understand the relationship of skills to population health;
Tax officials and their policy makers
Social development ministers and their policy makers need information to understand trends in skill levels and the role that skill plays in creating social inequity in economic, educational, social and other outcomes;
Agriculture ministers and their policy makers need information to understand the connections between skill level and changes in agricultural practice;
Industry ministers and their policy makers need information to monitor the supply of highly skilled workers and the quality of the skill flow leaving the post-secondary education system.