Data Use Net Discussion Series

Using Data to Improve Health Programs: Strategies to increase data use at the service delivery point

June 13-17, 2011

Moderators: Tara Nutley, MEASURE Evaluation & Astou Coly, URC-CHS

Context

Around the world, data are collected at health facilities about the populations they serve, their health needs, and the services provided to meet those needs. These data populate the reports required by the various national health programs. Often, once these data are sent to higher levels of the health system, they are not revisited or used by the facilities themselves, or by their district/regional management, to make decisions about future service delivery. While providers may use data for individual patient management, health managers and providers rarely analyze the data they collect to monitor service delivery trends or to assess problems and identify new strategies for improving health services. As a result, many health systems fail to fully link evidence to decisions and suffer a diminished ability to respond to the priority needs of the communities they serve.

More strategic and effective use of routine health data can inform the decision-making processes at subnational levels. These data can be used to guide program design, management, and service provision. The effectiveness of health programs throughout the world is dependent on the ability of program managers and providers to identify needs in the communities they serve and to understand the extent to which their programs address these needs.

Discussion

The Data Use Net (DUN) discussion on Using Data to Improve Health Programs: Strategies to increase data use at the service delivery point ran from June 13 through June 17, 2011. Each day DUN moderators posted a question that was relevant to the overall discussion topic. Responses to the questions were shared daily with the DUN on-line community. The questions included:

  1. In what context have you facilitated data use at the service delivery-level?

Please comment on the specific steps/activities implemented to improve data use and the outcome of improving data use.

  2. Did you use any concrete tools to facilitate data use at the facility level?

Please provide detail on the context where you used the tool and a copy of the tool.

  3. What have been the enabling factors which have led to increased use of data in program monitoring and decision making? What barriers have you encountered?
  4. What lessons have you learned when facilitating data use?

Please comment on specific skills and/or the environment needed to facilitate data use at the facility level.

Also comment on how you have shared your data use lessons learned with others.

  5. How do we encourage and sustain improvements in data-informed decision making at the facility level?

Throughout the week, DUN members and moderators shared materials including tools and resources relevant to the discussion topic. All resources are included in Appendix 1. The daily discussions are listed by day and are included in Appendix 2. A summary of each day’s discussion is included below.

Day 1 summary of responses to the question - How have you facilitated data use at the service delivery level?

  • Focused on defining clear program results (objectives) and mapped program activities to results.
  • Highlighted the link between data and good decision making.
  • Shared facility-level data with district-level administrative leadership to identify low-performing facilities. Leadership used the data to advocate for programmatic improvements.
  • Developed data feedback tools that displayed validated monthly achievements and quality performance for selected indicators in each program area.
  • Engaged data users (program managers and providers) and data producers (M&E specialists) in the M&E and program improvement processes.
  • Implemented a Plan-Do-Study-Act Cycle that allowed communities to collect their own data, plot and review it on a monthly (or twice per month) basis and act based on what they were learning about the services they were delivering.
  • Implemented standardized M&E guidelines, an open-access MIS system and data quality assurance systems to ensure quality M&E and data use.
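Several of the approaches above, from the Plan-Do-Study-Act cycle to the data feedback tools, come down to plotting and reading simple monthly time series. As a minimal sketch, assuming illustrative monthly service figures (none of the numbers come from the discussion), a basic run chart analysis compares each month against the median baseline:

```python
from statistics import median

def run_chart(monthly_values):
    """Return (baseline, flags) where each flag records whether the month's
    value sits above, below, or on the baseline median.

    A long run of points on one side of the median (commonly 6 or more)
    is a signal of non-random change worth investigating."""
    baseline = median(monthly_values)
    flags = []
    for value in monthly_values:
        if value > baseline:
            flags.append("above")
        elif value < baseline:
            flags.append("below")
        else:
            flags.append("on")
    return baseline, flags

# Illustrative monthly DPT3 doses given at one facility
doses = [40, 42, 38, 41, 55, 58, 60, 57]
baseline, flags = run_chart(doses)
print(baseline, flags)
```

Even without software, the same comparison can be done on a paper chart: draw the median line and mark which side each month falls on.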

Day 2 summary of responses to the question - Did you use any concrete tools to facilitate data use at the facility level?

Tools shared by DUN members:

  • Documentation Journal for QI Teams
  • Guidance for Analyzing Quality Improvement Data Using Time Series Charts
  • HCI QI Team Database Template
  • The Run Chart: A simple analytical tool for learning from variation in Healthcare Processes
  • The Seven Steps to Use Routine Information to IMPROVE HIV/AIDS Programs
  • Family Planning District & Facility Tools for Data Management
  • IQ Chart
  • DQA Checklist

Day 3 summary of responses to the questions - What have been the enabling factors which have led to increased use of data in program monitoring and decision making? What barriers have you encountered?

Enabling factors

  • Regular data review (monthly and/or quarterly) in collaboration with data users and data producers. Action taken based on review.
  • Program staff and management are also trained in M&E.
  • Availability of appropriate tools, guidelines and standard operating procedures (M&E plans).
  • Regular training of facility staff on M&E and MIS systems.
  • Willingness of local leaders to use data, developed through M&E training and training on data use and analysis.
  • Institutionalization of quality improvement committees.
  • Performance-based financing schemes at the provincial, district and health facility levels, together with use of the Quality Improvement Tool.
  • Movement away from vertical, top-down, monolithic and unchallenged data systems toward the promotion of 'data democracy', whereby those who collect the data have a say in the kind of data that suits their information needs.
  • Decentralization of health systems.
  • Increased accountability requirements of donors and MOH partners.

Barriers

  • Political officials disagree with negative M&E results.
  • Data quality – discrepancies between facility and state-level data.
  • Many data management systems are not computerized. These are very time-consuming to maintain, taking away time that could be spent analyzing and using data for program improvement.
  • Even where computerized MIS systems do exist, many staff still rely on paper-based systems.
  • Lack of skills and competencies at the facility level in using computers and software. Also, most IT instructional material is in English, which is not widely spoken in some countries.
  • Lack of reward systems that distinguish between good and bad performers, users and non-users of information.

Day 4 summary of responses - What lessons have you learned when facilitating data use at the service delivery level?

  • Highlight the importance of data in a way that relates it to an individual’s day to day life or other familiar situation.
  • Use simple techniques and tools for data analysis.
  • Prioritize indicators. Don’t try to analyze them all at once. Slowly build positive experiences using data – this promotes more use.
  • Encourage a culture of data sharing among the wider program/technical team members.
  • Encourage all responsible people to do the data analysis and use at their level
  • Foster leadership to create and lead the culture of data use.
  • Increase the understanding of the MOH of the need to implement information systems that enable use of data (data democracy).
  • Promote/foster decentralized decision-making environments. Accountability encourages data use.
  • Persistent supportive supervision to build the capacity of nationally and regionally appointed MOH HMIS officers to undertake DQA.
  • Availability of data doesn't necessarily lead to its use, nor does the availability of better-quality data.
  • To encourage data use, you need to fully understand the manager's role and skillfully guide which decisions can be made using data; once a decision is made, give feedback on the positive change using data.
  • If there is no incentive to use the data it won’t be used.

Day 5 summary of responses - How do we encourage and sustain improvements in data-informed decision making at the facility level?

  • Develop a culture and internalize a system of data use for planning. This includes: a clear M&E plan, relevant job aids/tools, simplified indicators with clear definitions, regular monitoring and coaching, a culture of appreciation and acknowledgement, and sharing best practices across sites and in external forums.
  • Create competition among service providers at the facility level through an annual performance prize strategy: at the end of every year, prizes are given to the best service provider and to the Local Government Authority making best use of data for decision making, and the process should happen across all levels.
  • Regular training and retraining in M&E and data use.
  • Always encourage the use of data before decision making at the health facility level. For example, before purchasing facility-related consumables, previous records of consumable consumption should be consulted.
  • Strengthen links between the facility and state/district top officials (via supervision visits) to encourage accountability among service providers.
  • Institutionalize interaction of data users and data producers through the use of easy analysis tools (such as Small Test of Change) and strong data feedback mechanisms.
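The consumable-purchasing example above is simple arithmetic over past records. The sketch below illustrates the idea with made-up quantities and an assumed two-month buffer; the function name and figures are hypothetical, not from the discussion:

```python
def order_quantity(monthly_consumption, stock_on_hand, buffer_months=2):
    """Suggest an order quantity: average past monthly consumption,
    project it over the buffer period, and subtract current stock."""
    average = sum(monthly_consumption) / len(monthly_consumption)
    needed = average * buffer_months
    return max(0, round(needed - stock_on_hand))

# Illustrative: glove boxes consumed over the last 6 months, 150 boxes in store
print(order_quantity([120, 130, 110, 140, 125, 135], stock_on_hand=150))  # -> 103
```

The point is not the formula itself but the habit: the order is derived from the facility's own consumption records rather than guessed.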

Appendix 1 – Resources

Background reading

  1. A Modern Paradigm for Improving Healthcare Quality
  2. Seven Steps to Use Routine Information to Improve HIV/AIDS Programs: A Guide for HIV/AIDS Program Managers and Providers
  3. RHINO Discussion Forum Report – RHIS and Service Quality
  4. Guidance for Analyzing Quality Improvement Data Using Time Series Charts

Tools submitted by DUN Members

  1. Routine Data Quality Assessment: An Approach for Data Verification
  2. Unique Identification Numbering
  3. Documentation Journal for Quality Improvement Teams
  4. Quality Improvement Database Template
  5. The Run Chart: A Simple Analytical Tool for Learning from Variation in Healthcare Processes
  6. District Family Planning Data Management Templates
  7. International Quality Clinical HIV/AIDS Registry Tool (IQ Chart)
  8. Results Framework
  9. Data Quality Assurance Checklist (DQA)

Other resources

Community-level Quality Improvement

Data Demand and Use Conceptual Framework

Run Chart tool

Introduction to Basic Data Analysis and Interpretation for Health Programs: A Training Tool Kit

Community Based M&E Systems

Vital Registration and Verbal Autopsy Tools

Data Quality Tools

PRISM Tools

Assessment of Data Use Constraints Version 2: Facility Visit

Data Demand and Use: Using Data to Improve Service Delivery. Training Tool Kit for Pre-service Nursing Education

Performance based incentives

Electronic Medical Records

Partnership Defined Quality (PDQ) methodology

Lessons from developing nations on improving health care

Building the Bridge from Human Resources Data to Effective Decisions: Ten Pillars of Successful Data-Driven Decision-Making.

Framework for Linking Data with Action

Routine Health Information Network


Appendix 2 – Daily Discussions

Day 1 Discussion posts

Day 1 Question - How have you facilitated data use at the service delivery level?

Name: Humberto Muquingue

Organizational affiliation: Jhpiego, Mozambique

Country of residence: Mozambique

The context where you facilitated data use:

As part of the health systems strengthening support provided by Jhpiego Mozambique to the Ministry of Health, we supported the development of course materials and facilitated integrated courses on planning, information systems, and monitoring and evaluation, reaching all 11 provinces of Mozambique. Participants included more than 900 health workers, among them health managers and information officers working at the district, provincial and central levels.

The specific steps/activities implemented:

1. Shared the need to have clear program results (or objectives) in order to design appropriate activities that will lead to results being achieved; also worked on problem trees to enable health professionals to identify priorities.

2. Used the results framework as the main tool to map results and link them to activities (action plan) and to ways of measuring progress and success (M&E plan)

3. Explored the main purposes of each of the course components and how they rely on each other: planning requires data that are provided by information systems, which in turn aim at feeding M&E mechanisms with appropriate accounts of how planned activities are running (or not).

The outcome of improving data use:

Poor post-training follow-up has constrained the assessment of any improvement in data use. There are, however, subjective impressions that higher reporting of some health indicators may reflect improved awareness of the need for reliable data collection.

Moderator comments:

We often see M&E skill-building efforts focused on M&E officers. Because they are the health professionals who support the M&E infrastructure, this is necessary. However, we often don't prioritize M&E skill building among other health professionals. Your approach of training program managers as well as M&E professionals is a good one. Program managers are the primary data users: they are the decision makers when it comes to revising, improving and defining health services. They also need the M&E knowledge to understand how to create program objectives, define and calculate indicators, collect data, and clean and store M&E data, as well as the skills to analyze, interpret, communicate and use the data for program improvement.

This is best done in collaboration with the data producers, the M&E professionals. When data users and data producers work together, they become aware of available data sources and knowledgeable about the quality of those data. They have the opportunity to address barriers to data use and improve the sharing of data resources. They can also discuss and identify key programmatic questions and concerns, link these questions to the data available in their settings, and jointly analyze and interpret data to answer them. In this context, ownership of data is built, so that when data-informed decisions are made the necessary buy-in exists to move the decision forward. By linking data users and data producers in the M&E process, the information cycle is strengthened and the value of data to program improvement becomes clear.

Humberto, training over 900 health workers in all 11 provinces provides an excellent opportunity to promote data use in Mozambique. I am wondering if you know the reasons for the poor post-training follow-up. Also, do you have any specific tools you used to do this? We note that you used a problem tree process as well as a results framework. Could you share these as part of the day 2 discussion?

Name: Akeem Ganiyu

Organization: T-SHIP Sokoto

Country of residence: Nigeria

The IMMbasic project began in October 2006 to strengthen Routine Immunization (RI) in northern Nigeria. The project worked with international partners and relevant government agencies at the national level and in two northern states, Bauchi and Sokoto. It aimed at strengthening both human resource and system capacities for improved delivery of RI services following Nigeria's REW (Reaching Every Ward) guidelines, which were adapted from the RED (Reaching Every District) approach for improving RI. The project had a short lifespan of only two and a half years. Sokoto state has 23 Local Government Areas (LGAs) with 244 wards. In April 2008, the RI coverage for the first quarter (January-March) was significantly low in some LGAs, most especially Gwadabawa LGA.

Basically, one of the issues found on the ground was that the LGAs and facilities did not use data for decision making; they simply compiled data and passed them to the next level. So what we did as a project was to use an advocacy visit strategy. The visits targeted the LGA Chairmen, because the chairman is the leader of the LGA and can be held responsible for what happens there. Our basis for advocacy was a chart of DPT3 coverage ranked by LGA performance, which revealed the DPT3 coverage status of each LGA. The advocacy team comprised an SMOH representative and IMMbasic staff. We first visited the poorest-performing LGA and showed the LGA chairman the RI coverage chart, on which his LGA had 3% DPT3 coverage.
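The ranked DPT3 coverage chart described above is easy to reproduce. In the sketch below, only Gwadabawa's 3% figure is taken from this account; the other LGA names and percentages are illustrative assumptions:

```python
def rank_by_coverage(coverage):
    """Sort LGAs from lowest to highest DPT3 coverage so the
    poorest performers surface at the top of the advocacy chart."""
    return sorted(coverage.items(), key=lambda item: item[1])

# DPT3 coverage (%) for Q1; Gwadabawa's 3% is from the discussion,
# the other LGAs and figures are illustrative
dpt3 = {"Gwadabawa": 3, "Wamako": 48, "Bodinga": 61, "Illela": 35}
for lga, pct in rank_by_coverage(dpt3):
    print(f"{lga}: {pct}%")
```

Ranking the same indicator the same way for every LGA is what made the chart persuasive: a chairman could see at a glance where his LGA stood relative to its neighbors.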

The chairman was surprised, because he had not been aware of the health issues in his LGA. In our presence he summoned the staff of the PHC department for a meeting. The output was that he helped the department resolve most of the issues, such as cold-chain and logistics problems, that he had not even known about. The outcome was that before the end of the year the DPT3 coverage of that particular LGA rose to 69%, from 3% in the first quarter. The same strategy was applied to all LGAs, and RI focal persons in each LGA gave feedback to their LGA Chairmen. This has been helpful in terms of RI services in the state. I hope this is helpful.