Assessment Tool for Laboratory Services
(ATLAS) 2006
DELIVER
DELIVER, a six-year worldwide technical assistance support contract, is funded by the President’s Emergency Plan for AIDS Relief (PEPFAR) through the U.S. Agency for International Development (USAID).
Implemented by John Snow, Inc. (JSI) (contract no. HRN-C-00-00-00010-00) and subcontractors (Manoff Group, Program for Appropriate Technology in Health [PATH], and Crown Agents Consultancy, Inc.), DELIVER strengthens the supply chains of health and family planning programs in developing countries to ensure the availability of critical health products for customers. DELIVER also provides technical management of USAID’s central contraceptive management information system.
Recommended Citation
Diallo, Abdourahmane, Lea Teclemariam, Barbara Felling, Erika Ronnow, Carolyn Hart, Wendy Nicodemus, and Lisa Hare. 2006. Assessment Tool for Laboratory Services (ATLAS) 2006. Arlington, Va.: DELIVER, for the U.S. Agency for International Development.
Second Printing: USAID | DELIVER PROJECT, Task Order 1. 2008.
Abstract
The Assessment Tool for Laboratory Services (ATLAS) 2006 is a data gathering tool developed by the DELIVER project to assess laboratory services and logistics. The ATLAS is a diagnostic and monitoring tool that can be used as a baseline survey, to complete an annual assessment, or as an integral part of the work planning process. The ATLAS is primarily a quantitative tool, complemented by a small-sample qualitative facility survey of available commodities and equipment. The information collected by using the ATLAS is analyzed to identify issues and opportunities and to outline further assessment and/or appropriate interventions.
The ATLAS is used to analyze the entire laboratory system. It includes three questionnaires: central administrative level, intermediate administrative level, and the facility (laboratory) level.
Assessments using the ATLAS can be conducted and analyzed in successive years. The results can contribute to the monitoring, improvement, and sustainability of laboratory performance and provide critical non-logistics data that can identify a country’s laboratory systems’ strengths and weaknesses.
DELIVER thanks the AIDS/HIV Integrated Model District Program (AIM) for its guidance on the infrastructure and inspection sections of the tool.
USAID | DELIVER PROJECT
John Snow, Inc.
1616 Fort Myer Drive, 11th Floor
Arlington, VA 22209 USA
Phone: 703-528-7474
Fax: 703-528-7480
E-mail:
Internet: deliver.jsi.com
Contents
Acronyms
Acknowledgments
Assessment Tool for Laboratory Services (ATLAS) User’s Guide
Background and Intended Use
Benefits
Overall Process
Planning for the ATLAS
Adapting the ATLAS
Data Encoding and Analysis
Analysis of the Collected Information
Conclusion
ATLAS—Central Administrative Level 2005
Central Administrative Level Questionnaire
I. Organization
II. Policy
III. Forecasting and Procurement
IV. Financing
V. Storage and Distribution
VI. Inventory Control System
VII. Laboratory Services Management Information System
VIII. Supervision
IX. General Questions
ATLAS—Intermediate Administrative Level 2005
Intermediate Administrative Level Questionnaire: General Information
I. Organization
II. Policy
III. Forecasting and Procurement
IV. Financing
V. Storage and Distribution
VI. Inventory Control System
VII. Laboratory Services Management Information System
VIII. Supervision
IX. General Questions
ATLAS—Facility Level 2005
Facility Level Questionnaire: General Information
I. National Guidelines and Protocols
II. Laboratory Personnel
III. Laboratory Testing Services
IV. Quality Assurance Tests
V. Equipment Availability and Maintenance
VI. Laboratory Supplies Logistics
VII. Laboratory Infrastructure
Interviewer's Guide to Inspecting the Laboratory Area
Bibliography
Acronyms
AFB     acid-fast bacilli
AIDS    acquired immunodeficiency syndrome
AIM     AIDS/HIV Integrated Model District Program
AMREF   African Medical and Research Foundation
AST     aspartate aminotransferase
CDC     Centers for Disease Control and Prevention
CSF     cerebrospinal fluid
DK      don't know
ELISA   enzyme-linked immunosorbent assay
GOT     glutamic oxaloacetic transaminase
Hb      hemoglobin
HC      health center
HIV     human immunodeficiency virus
JSI     John Snow, Inc.
KOH     potassium hydroxide
LIAT    Logistics Indicators Assessment Tool
LMIS    logistics management information system
LSAT    Logistics System Assessment Tool
MOF     Ministry of Finance
MOH     Ministry of Health
p24     protein 24
PCR     polymerase chain reaction
pH      potential hydrogen
RNA     ribonucleic acid
RPR     rapid plasma reagin
RT      reverse transcriptase
SGOT    serum glutamic oxaloacetic transaminase
SGPT    serum glutamic pyruvic transaminase
SOP     standard operating procedure
STI     sexually transmitted infection
TB      tuberculosis
TPHA    Treponema pallidum hemagglutination assay
TSI     triple sugar iron
USAID   U.S. Agency for International Development
VDRL    venereal disease research laboratory
ZN      Ziehl-Neelsen
Acknowledgments
This paper, which is a component of the CD Resources for Managing the Laboratory Supply Chain, is dedicated to people around the world living with HIV/AIDS and to the many individuals from communities, nongovernmental organizations (NGOs), faith-based organizations, Ministries of Health, and other organizations who have consistently fought for access to antiretroviral drugs and other commodities required to provide HIV/AIDS services. The CD is also dedicated to friends and counterparts who have worked with DELIVER, the Family Planning Logistics Management project, and John Snow, Inc., since 1986 and to the thousands of committed professionals in Ministries of Health and NGOs who work daily to supply their customers and programs with essential public health commodities. Although the resources provide a focus on specific HIV/AIDS and laboratory commodities, we recognize that comprehensive HIV/AIDS and laboratory programs require the supply chain to manage and deliver a broad range of several hundred public health commodities.
The U.S. Agency for International Development (USAID) contracts funded the technical assistance, in-country projects, and research that produced the experience and lessons contained in the Resources. We are deeply grateful to the team of professionals in the Commodity Security and Logistics Division in the Office of Population and Reproductive Health of the USAID Global Health Bureau’s Center for Population, Health, and Nutrition—especially Mark Rilling and Sharmila Raj—for their encouragement and advice and their commitment to improving HIV/AIDS laboratory and public health programs through logistics.
Numerous people helped write this and other documents that constitute the Resources. Sincere thanks go to the core team of dedicated technical staff who developed and wrote the components—namely, Abdourahmane Diallo, Barbara Felling, Wendy Nicodemus, Colleen McLaughlin, Lea Teclemariam, Ronald Brown, Yasmin Chandani, Claudia Allers, Gregory Roche, Erika Ronnow, Aoua Diarra, Jane Feinberg, Carmit Keddem, Lisa Hare, Carolyn Hart, Naomi Printz, Paula Nersesian, Meba Kagone, Kim Peacock, and Motomoke Eomba. Special thanks go to Edward Wilson, Nancy Cylke, Richard Owens, Johnnie Amenyah, Greg Miles, Jennifer Antilla, and Lisa Cohan for their significant contributions and valuable support.
Field examples and data were generously contributed by Hannington Ahenda, Steve Kinzett, Steve Wilbur, Gaspard Guma, Catherine Lwenya, Moses Muwonge, Walter Proper, and Jayne Waweru. The lessons drawn from DELIVER's experience in managing HIV/AIDS and laboratory supply chains would not have been possible without these valuable contributions.
The DELIVER Communications Group edited, designed, and produced the Resources. Their patience, persistence, insight, and support are much appreciated. In particular, appreciation goes to Heather Davis, communications manager; Pat Shawkey, publications manager; Pat Spellman, editor; Gus Osorio, art director; Kathy Strauss, Paula Lancaster, and Susan Westrate, graphic designers; Erin Broekhuysen, communications strategist; Delphi Lee, JSI assistant webmaster; José Padua, DELIVER web manager; Madeline McCaul, communications officer; Jessica Philie, publications coordinator; and Jacqueline Purtell, communications coordinator.
Assessment Tool for Laboratory Services (ATLAS) User’s Guide
Background and Intended Use
The Assessment Tool for Laboratory Services (ATLAS) is a data gathering tool developed by the DELIVER project to assess laboratory services and logistics. The ATLAS, a diagnostic and monitoring tool, can be used as a baseline survey, to complete an annual assessment, or as an integral part of the work planning process. The information collected by using the ATLAS is analyzed to identify issues and opportunities and to outline further assessment and/or appropriate interventions.
Assessments using the ATLAS can be conducted and analyzed in successive years. The results can contribute to monitoring, improving, and sustaining laboratory performance and provide critical non-logistics data that identify a country's laboratory system's strengths and weaknesses.
Benefits
The ATLAS can—
- Provide
  - stakeholders with a comprehensive view of all aspects of the laboratory services
  - a snapshot of testing capabilities and commodity availability at laboratories throughout the system
  - input for work planning.
- Be used
  - as a diagnostic tool to identify issues and opportunities for each individual laboratory in a given country
  - by country personnel as a monitoring tool (to learn and continually improve performance).
- Raise collective awareness and ownership of laboratory services performance and goals for improvement.
Overall Process
Assessment period/cycle
The ATLAS can be conducted at any time as a baseline assessment or at a time agreed upon within selected countries. Ideally, the ATLAS should be conducted within the three-month period prior to work planning or strategic planning exercises.
Data collection
The ATLAS contains three questionnaires:
- central administrative level
- intermediate administrative level (if applicable)
- facility (laboratory) level.
The three questionnaires need to be adapted for the in-country system. The intermediate administrative level questionnaire focuses on decentralized logistics functions. In a highly decentralized system, this questionnaire will need to be adapted. See the section Adapting the ATLAS.
This structure allows different methods to be used for each questionnaire. In general, three methods are recommended for data collection:
- Discussion groups can be conducted at the central level with officials at that level only (using the central administrative level questionnaire) or with representatives of both the central and intermediate levels (using both central and intermediate administrative level questionnaires). Discussion groups can also be conducted separately at the intermediate level (using the intermediate administrative level questionnaire).
- The ATLAS can be used as a guide when conducting key informant interviews at the central and intermediate levels. If key informant interviews are used, it may be necessary to interview multiple people with varying degrees of knowledge of the system to complete the questionnaire. All key informant interviews should be consolidated, and the answers should be reconciled.
- Field visits are the preferred method to use with the facility level assessment. These visits are necessary to evaluate the infrastructure, storage conditions, and the availability and status of equipment and supplies.
To have a complete assessment, it is highly recommended that the ATLAS be used for a group discussion at the central level (and intermediate level, if applicable) and for field visits at the facility level.
Data analysis and work plan development should take place immediately following data collection. This process should include a thorough review of strengths and weaknesses to develop and prioritize a set of objectives and interventions designed to address the issues raised through the assessment.
Learning and performance improvement
The ATLAS provides a comprehensive overview, particularly at the facility level. The baseline data it provides can facilitate performance and process improvement. However, the timing of a repeat ATLAS depends on the outcomes of the interventions that follow; it is preferable to wait for the interventions to take place before repeating the ATLAS.
Planning for the ATLAS
Preparatory research
Some aspects of the ATLAS should be researched in advance of the group discussion and field visits. The general levels of the system should be identified (i.e., whether the country uses regional, zonal, or provincial levels). The evaluation team should also know whether some key functions are decentralized; in many countries, key policy and logistics decisions are made at an intermediate administrative level (e.g., the district or the regional office). In this case, the intermediate administrative level questionnaire will need to be adapted to reflect the different responsibilities at each level. See the section Adapting the ATLAS for more information.
Additionally, the evaluation team should try (if possible) to collect all policy and guideline documents prior to the interviews. These documents can help guide the discussion.
Choosing the data collection method
Talk with the program managers or country counterparts and agree upon the approach to be used.
Small discussion groups are preferable for the central and intermediate level questionnaires. These groups may require a few hours to gain the breadth and depth of data required and to provide adequate opportunity for full participation.
If the assessment is intended to develop strategies for systemic interventions (e.g., design a logistics system for laboratory supplies), field visits to sample facilities should be included and planned. Before drawing the sample, all parties should agree to the criteria for selecting the facilities. A sampling frame that includes the complete list of facilities to be assessed will be required. The list should be stratified by region/province, facility type, and urban or rural area, as appropriate. Ideally, the sample size should be allocated proportionally within each stratum (i.e., region/province, facility type, urban or rural, etc.). A stratified sample will provide more precision than a simple random sample. The sample size should be determined on the basis of standard statistical formulas.
In case of resource constraints, visit a minimum of 100 facilities as a default.[1] Fewer facilities may be considered for cross-sectional rapid assessments or qualitative studies, but a smaller sample is not ideal for measuring statistically significant changes over time. In some cases, to avoid extensive travel, two-stage sampling may also be considered: in the first stage, administrative areas (e.g., region, province, district) are randomly selected, and facilities within those areas are then selected during the second stage.
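As a minimal sketch of what the "standard statistical formulas" mentioned above can look like in practice, the Python snippet below computes a sample size for estimating a proportion at roughly 95 percent confidence (with a finite population correction) and then allocates it proportionally across strata. The stratum names, facility counts, margin of error, and the use of the 100-facility floor are illustrative assumptions, not values prescribed by the ATLAS.

```python
import math

def sample_size_for_proportion(p=0.5, margin=0.10, z=1.96, population=None):
    """Sample size for estimating a proportion p within a given margin of
    error at ~95% confidence (z = 1.96); p = 0.5 is the conservative choice.
    An optional finite population correction is applied when the total
    number of facilities in the sampling frame is known."""
    n = (z ** 2) * p * (1 - p) / (margin ** 2)
    if population:
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

def proportional_allocation(strata_sizes, total_sample):
    """Split the total sample across strata in proportion to each stratum's
    share of the sampling frame (region/province, facility type, urban/rural)."""
    frame_total = sum(strata_sizes.values())
    return {stratum: round(total_sample * count / frame_total)
            for stratum, count in strata_sizes.items()}

# Hypothetical sampling frame: facility counts by region and facility type.
frame = {
    "Region A / hospital": 40,
    "Region A / health center": 120,
    "Region B / hospital": 25,
    "Region B / health center": 95,
}

n = sample_size_for_proportion(population=sum(frame.values()))
n = max(n, 100)  # apply the default floor of 100 facilities noted above
print(n, proportional_allocation(frame, n))
```

For a two-stage design, the same allocation logic can be applied after the administrative areas have been randomly selected in the first stage.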
If the plan is to provide information for the development and implementation of interventions specifically for each facility, then a countrywide assessment plan should be developed, and a visit made to each laboratory facility considered for the intervention.
A combination of discussion groups (and key informant interviews, if appropriate) for the central and intermediate level questionnaires, and field visits for the facility level questionnaire, is the preferred approach for conducting an ATLAS.
After the data collection is completed, a joint discussion group that includes representatives from all levels and all programs (e.g., laboratory services, tuberculosis and leprosy control, HIV/AIDS, malaria, etc.) should be organized to reconcile findings and develop a work plan.
Number and qualifications of data collectors
It is important that the same data collectors are available for both the group sessions and field visits. Because many laboratories have limited space and no facilities for visitors, it is important to give careful consideration to the number and the skill sets of the data collectors. The evaluation teams should usually not exceed four members during a field visit. Each evaluation team should include at least one interviewer with laboratory experience who can understand and interpret the terminology specific to laboratories and at least one interviewer with experience assessing and designing logistics systems.
Selecting interviewees
- Central level
To obtain accurate data about the functioning of each aspect of laboratory services, it is very important to interview the right set of people.
At the central level, it is critical to identify the division or unit that is responsible for managing laboratory services in a specific country. Representatives from the senior management of that unit are the most appropriate interviewees for this level. In addition, representatives from programs that require testing services (e.g., HIV/AIDS, TB, STI, malaria, etc.), the division responsible for forecasting/procurement (e.g., Ministry of Finance or pharmacy division at the Ministry of Health [MOH]), and the senior stores officer from the supplying facilities (such as the central medical stores) should participate in completing the central level questionnaire.
- Intermediate level
As explained earlier, the intermediate level questionnaire collects data on management level issues, similar to the central level, but, specifically, for a decentralized setting. Members of the district or regional level management team are usually appropriate. These management teams include, among others, district or regional medical officers in charge, head financial officers, chief pharmacists, chief laboratory technologists, medical superintendents, and, in some cases, representatives from the community.
- Facility level
The laboratory technologist in charge is the correct person to interview. In her or his absence, the most senior laboratory technologist (or technician) can be interviewed. Any of the technical staff in the laboratory should be able to answer most of the questions in the facility level section of the tool. It is important to remember that this step includes an extensive inspection of the laboratory supplies storage area, infrastructure, and equipment. Therefore, a knowledgeable technician should be interviewed.
Table 1 shows the required knowledge areas for the interviewees, by level.
Table 1: Required Knowledge Areas of Participants, by Level
Knowledge Area / Central Level / Intermediate Level / Facility Level
National Laboratory System Organization / X / X / -
National Policies / X / X / X
Forecasting and Procurement / X / X / -
Financing / X / X / -
Storage and Distribution / X / X / X
Inventory Management / X / X / X
Laboratory Management Information System / X / X / X
Laboratory LMIS / X / X / X
Supervision / X / X / X
Laboratory Personnel / X / X / X
Laboratory Testing Services / - / - / X
Testing for Quality Assurance / X / X / X
Equipment Availability and Maintenance / - / - / X
Supply Availability / - / - / X
Laboratory Infrastructure / - / - / X
Planning field visits
Field visits should be made after the discussion sessions/interviews with the central level because the facility level tool will need to be customized for this program or country. It is recommended that the interviewers make field visits with appropriate stakeholders, if possible. All field visits should be scheduled ahead of time to ensure that the appropriate staff member will be available.