A Phone Survey without Phones? Lessons in How to Reach Those without Access to Mobile Phones.

Editor’s Note: this is the second in a series of posts about adapting an ongoing policy project about digital payments with Bangladesh’s government to the current COVID-19 crisis.
 
In the first blog post of this series, published in May, IPA was newly transitioning to phone surveys during the pandemic. IPA has been working with A2i, a branch of the Bangladesh government that helps ensure access to public services, to monitor the digitization of government social support payments (moving from cash payments to deposits into newly opened bank accounts) for three vulnerable populations: elderly pensioners, widows, and people with disabilities.
 
The aim of the partnership is to identify challenges beneficiaries may experience so that they can be quickly addressed and the process improved. In that post, we raised questions about the effectiveness of conducting a phone survey during a pandemic, with a focus on reaching people (particularly women) who are unlikely to own their own mobile phone. We completed the first of four rounds of the survey a few months ago and have some interesting answers and lessons learned to share, particularly about methodology.
 
When transitioning from an in-person survey to a phone-based one, we focused our attention on three methodological concerns:
  1. Ensuring a representative sample of beneficiaries regardless of access to a phone;
  2. Establishing a private and confidential place for beneficiaries to answer the survey; and
  3. Getting participants to complete the 30-minute phone survey to the best of their ability.

How do we call phoneless beneficiaries?

Before we began reaching out to a sample of beneficiaries, we analyzed the demographic dataset provided by A2i to ensure our sample was representative of the whole population of beneficiaries. We were particularly focused on reaching women, who are less likely to own phones, since one of our key research questions is how the digitization process works for those who do not own or have access to a mobile phone. In-person, door-to-door sampling makes it easier to find those without phones, but that was obviously not an option. Reviewing the dataset, we noticed that 20 percent of phone numbers were duplicates, an unusually large share that seemed to indicate a data entry problem. As we started calling these numbers to investigate, we discovered that they were usually legitimate phone numbers for Ward members (local officials) in beneficiaries’ communities. Many beneficiaries who did not have their own phone number had named their Ward member as their contact person, and the Ward member was typically able to put us in touch with the person we were looking for.
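
To make the duplicate check concrete, here is a minimal sketch of that kind of analysis in Python with pandas. The file name and the "phone_number" column are illustrative assumptions, not the actual structure of the A2i dataset.

```python
# Minimal sketch of a duplicate phone number check like the one described above.
# The file name and column name are illustrative assumptions, not the actual
# A2i beneficiary dataset.
import pandas as pd

beneficiaries = pd.read_csv("beneficiary_roster.csv")  # hypothetical roster export

# Flag every record whose phone number also appears on at least one other record.
shared_number = beneficiaries["phone_number"].duplicated(keep=False)

print(f"Share of records with a non-unique phone number: {shared_number.mean():.0%}")

# Inspect the shared numbers before treating them as data entry errors; in our
# case many turned out to be legitimate contact numbers for Ward members
# listed by beneficiaries who did not own a phone.
print(beneficiaries.loc[shared_number].sort_values("phone_number").head(10))
```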
 
Lesson learned: New methods might mean new kinds of patterns in the data, so allow time to investigate and adjust. 

How can we make respondents comfortable when asking about sensitive topics?

Another major concern in transitioning to phone surveys was to ensure beneficiaries would be in a private and safe environment when answering our 30-minute-long survey. We knew that asking people private questions about topics like health and personal finance can be tricky, particularly when there might be others nearby listening. Again, verifying privacy and building trust before getting to sensitive questions is easier in person, but we adapted. In the end, despite the additional time it would require, we included a set of questions at the beginning of the survey to give the beneficiary time to build rapport with the interviewer and find a comfortable place to speak. It turned out very few respondents needed to find a new place, but this helped ensure we would not be excluding any participants because of concerns about being overheard.
 
Lesson learned: Allow more time at the beginning of the survey to build trust with the beneficiary so that they feel safe and open to discussing key confidential information in a private location.

How can we ensure the beneficiary can complete the survey on the phone?

We had concerns from the outset about people having the stamina and interest for 30 minutes of answering questions, but as we began we discovered many had physical or mental limitations, like hearing loss, that got in the way (of those who reported a limitation, 87 percent cited a physical condition and 1 percent a mental or psychological one). So we added a set of questions early on allowing them to include a helper, typically a family member, to assist or act as an intermediary between the respondent and the phone interviewer. By the end of the data collection, nearly 25 percent of respondents had availed themselves of the option to receive help.
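
As a rough illustration of the branching this involved, here is a minimal sketch in plain Python rather than a survey platform's own logic. The question wording, field names, and console prompts are illustrative assumptions, not the actual instrument.

```python
# Minimal sketch of the early "helper" questions described above. Question
# wording, field names, and the console prompts are illustrative assumptions,
# not the actual survey instrument.

def ask_yes_no(prompt: str) -> bool:
    """Stand-in for the interviewer recording a yes/no answer."""
    return input(f"{prompt} (y/n): ").strip().lower().startswith("y")


def helper_module() -> dict:
    """Ask early on whether the respondent would like someone to relay questions."""
    has_difficulty = ask_yes_no(
        "Do you have any difficulty hearing or answering questions over the phone?"
    )
    if not has_difficulty:
        return {"needs_helper": False, "helper_relation": None}

    wants_helper = ask_yes_no(
        "Is there someone nearby, such as a family member, who could help relay the questions?"
    )
    helper_relation = (
        input("What is that person's relationship to you? ").strip()
        if wants_helper
        else None
    )
    return {"needs_helper": wants_helper, "helper_relation": helper_relation}


if __name__ == "__main__":
    print(helper_module())
```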
 
Lesson learned: Be sure to take into account the population you’re sampling and any disabilities or difficulties they may have with the survey. Design the survey itself and data collection methodology around the needs of the survey population.

What does this mean for our data and research?

Though we faced challenges transitioning to a phone survey during the pandemic, our additional steps ensured that we reached the most vulnerable populations and addressed our biggest concerns. Our work and findings are meant both to improve the digitization process for this specific program in Bangladesh and to provide broader lessons for financial inclusion and for incorporating new methodologies into future research projects.
 
In our next post, we will share the findings from this monitoring survey in more detail—stay tuned!
 
December 29, 2020