Strengthening State Capacity in Health with Administrative Data in Sierra Leone
Abstract
The Government of Sierra Leone’s post-Ebola health system strengthening efforts have been compromised by a lack of accurate data on which services are provided and how often. Researchers measured the accuracy of health records in clinics by comparing actual immunizations given to children to administrative clinic records. Across clinics in all districts, immunizations were correctly recorded for 72 percent of surveyed children in the newly introduced project register and for 79 percent in the existing Expanded Program on Immunization (EPI) register.
Policy Issue
Accurate and timely health data are critical for governments to effectively provide healthcare services to their citizens and to identify weaknesses in service provision. This is especially true in West Africa, which in 2014 endured the largest outbreak of the Ebola virus in recent history. Weak monitoring systems and poor public health infrastructure contributed to the rapid spread of the virus in the region.
Administrative data has the potential to provide comprehensive and frequent information on utilization from health facilities at a lower cost than survey data. Two ways of assessing a health care system using administrative data are monitoring how often people use it and what type of care they receive. States use this information to make informed spending decisions and identify weaknesses in the system; however, untimely or inaccurate data hinders these efforts. One way to strengthen health data systems is to test the accuracy of the data being reported and retrain health workers where necessary.
Context of the Evaluation
During the Ebola crisis in 2014, Sierra Leone had 14,000 cases of the Ebola virus, resulting in roughly 4,000 deaths. Since then, a lack of accurate data on which services are provided and how often has compromised the Government of Sierra Leone’s post-Ebola health system strengthening efforts.
Sierra Leone’s Ministry of Health and Sanitation (MoHS) determines how many citizens are using the health system primarily from summary reporting by its 1,239 peripheral health units. These are small clinics that provide primary health care and focus on child and maternal health services. Clinic staff use different registers, tallies, and summary sheets to record health visits such as immunizations and prenatal care visits. At the end of the month, this data is aggregated and sent to district-level officials. For immunizations, district officials consolidate the immunization numbers and other health indicators from all clinics in their district and report them to the Program Manager for Child Health, who uses the data to compute immunization coverage rates, assess current utilization of services, and forecast vaccine demand for each district.
District-level management teams are supposed to visit the clinics quarterly to physically check the consistency and accuracy of the data, but not all districts fully adhere to this policy. District officials cited funding challenges, limited appreciation of the importance of data, lack of computer literacy, and a shortage of human resources as the primary constraints on regular data monitoring.
Details of the Intervention
As part of a separate study, researchers sought to measure the accuracy of health records in clinics in Sierra Leone by comparing actual immunizations received by children to clinic records from 120 randomly selected clinics in Bombali, Kambia, Tonkolili and Western Area Rural districts from July 2016 to March 2018.
To measure the accuracy of clinic health records, researchers provided clinics with a child “project” register for recording immunization visits, and supplied the official Expanded Program on Immunization (EPI) clinic register to those that did not have one at the start of the project. The research team designed the child register to encourage clinics to accurately record immunization visits on the spot instead of filling in the register later (from memory) or prefilling entries before the actual immunization took place. Researchers trained all clinic staff involved in immunization services in proper record keeping and provided feedback on irregularities during monitoring visits, both for the regular EPI clinic register and the new project register. Researchers also instituted a small financial incentive in the form of mobile airtime (up to SLL 20,000), conditional on proper implementation of immunization services, the study’s social incentive program, and record keeping.
To measure actual immunizations received by children, researchers surveyed the caregivers of a sample of 14,062 children under the age of two in the areas surrounding the clinics. Surveyors asked caregivers about the immunizations received and recorded the immunizations marked on the child’s growth card provided at the time of the visit. Researchers then compared these children’s actual immunization records to what was reported in the project registers to measure the accuracy of recorded immunization visits and clinic staff reporting behaviors. In early 2018, researchers conducted a follow-up survey of the caregivers of a sample of 1,343 children. They compared these children’s actual immunization records to the number of recorded immunization visits in the EPI register to measure whether reporting behavior differs for government forms (i.e., the EPI register) compared to forms instituted by implementing partners as part of a specific program (i.e., the project register).
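The comparison described above amounts to matching each surveyed child’s verified immunization count against the corresponding register entry and tabulating agreement. A minimal sketch of that tabulation, using invented child identifiers and counts (the actual study’s data structure and matching procedure are not specified here):

```python
# Hypothetical records: each tuple holds a child ID, the number of
# immunizations verified by the caregiver survey (treated as ground truth,
# e.g. from the child's growth card), and the number in the clinic register.
records = [
    ("c01", 3, 3),  # accurate entry
    ("c02", 2, 4),  # overreported: register inflates visits
    ("c03", 4, 4),  # accurate entry
    ("c04", 3, 2),  # underreported: a visit was never entered
    ("c05", 1, 1),  # accurate entry
]

def reporting_rates(records):
    """Return the shares of children whose register entries are accurate,
    overreported, and underreported relative to the survey."""
    n = len(records)
    accurate = sum(1 for _, actual, reg in records if reg == actual)
    over = sum(1 for _, actual, reg in records if reg > actual)
    under = sum(1 for _, actual, reg in records if reg < actual)
    return accurate / n, over / n, under / n

acc, over, under = reporting_rates(records)
print(f"accurate: {acc:.0%}, overreported: {over:.0%}, underreported: {under:.0%}")
# → accurate: 60%, overreported: 20%, underreported: 20%
```

Distinguishing over- from underreporting, rather than computing a single error rate, is what allows the direction of errors (inflation versus missed entries) to be compared across the two registers.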
Results and Policy Lessons
Analysis of clinic register data showed that average rates of accurate reporting were fairly high, at 72 percent for project register data and 79 percent for EPI register data, conditional on clinic staff being (re)trained in data entry, receiving regular monitoring of and feedback on data entries, and earning a small reward linked to data accuracy. However, there was substantial variation: some clinics produced perfectly accurate data while others showed large discrepancies.
For most clinics, reporting errors were random, but for a smaller yet sizable share of clinics, researchers observed that clinic staff were deliberately inflating the number of immunizations reported on regular clinic registers. The newly introduced project register mitigated such behavior, reducing overreports by half. On the flip side, the EPI register led to almost no underreporting, while the newly introduced project register produced a 12 percentage point increase in missed vaccination entries. Researchers observed two behaviors among clinic staff that may explain this increase. First, nurses who administered vaccinations did not record immunizations on the spot but instead filled in the registers later; doing so was harder for the project register than for the regular clinic registers. Second, filling out registers was an effortful task for clinic staff. Together, these findings suggest that while it can be hard to motivate clinic staff to fill out additional registers, a register designed to make pre-filling or inflating data more difficult can mitigate overreporting.
The reporting discrepancies were largest for the summary forms that clinic staff forwarded to the districts. The errors went in both directions, with clinic staff reporting numbers both higher and lower than the register data. Reassuringly, the data that district officials reported to the Ministry of Health and Sanitation was highly consistent, with some variation in quality across districts.
Based on the observed reporting behavior, the research team has three recommendations:
1. To mitigate biases in reporting, clinic registers could be redesigned to make it difficult for nurses to record vaccinations that children have not yet received.
2. Implementing partners should avoid introducing additional registers or forms into existing health programs, as clinic staff have limited capacity and/or motivation to fill in additional forms. Partners and the MoHS should therefore work closely together to avoid duplicating data collection at clinics and coordinate on improving existing forms so that they generate the necessary information.
3. Regular clinic monitoring and (re)training of clinic staff focused on data entry procedures (e.g., recording vaccinations as they are given rather than later) could improve accuracy.