Tips for Collecting Surveys of Hard-to-Reach Populations
Editor's note: This cross-post originally appeared on the Development Impact Blog.
Recently, in joint work with Ana María Ibáñez, Andrés Moya, María Ortega, Marisol Rodríguez, and with field support from IPA Colombia, I have been working on a research project that examines the effects of a large amnesty on migrants' lives. The study centers on the regularization of approximately half a million Venezuelan refugees in Colombia through the PEP program (Permiso Especial de Permanencia). To execute the project, we collected phone surveys of regular and irregular migrants; irregular migrants are those living in Colombia without the proper migratory documentation.
This was a challenging project: we were trying to find a representative sample of irregular Venezuelan migrants, a population with deep trust issues, in the middle of the COVID-19 pandemic, and through phone calls. The good news is that we managed to complete the project and collected 3,455 surveys of migrant households. The surveys are representative of the regularized population and of irregular migrants who arrived in Colombia between January 2017 and December 2018.
In the data collection process, we learned important lessons that may help other researchers and practitioners collect data on vulnerable populations with trust issues. Here are the main lessons:
1. Use qualitative studies prior to launching the survey pilots
Before launching the piloting stage of the survey, we conducted 42 semi-structured interviews with migrants to discuss the study and the instrument. The qualitative study revealed relevant lessons for the survey design. First, it signaled immense trust issues (migrants were afraid of being deported, for example) and suggested that contact through other migrants could help build trust. Migrants were also frequently targeted with scams and fake information through WhatsApp, Facebook, or calls and messages to their phones. This made them distrustful and could affect their willingness to participate in surveys and to provide truthful information over the phone on sensitive topics such as their migratory status, their income, or questions on integration. Migrants also reported that local authorities and NGOs often produce information in language that is difficult for them to understand. Although Venezuelans and Colombians share a common language, there are important differences in their day-to-day choice of words. The lessons from these interviews were central to our success and shaped many of the corrective actions I describe below.
2. Work through local organizations
Our second step was to share the objectives of our study with local migrant NGOs. We worked with these organizations so that they were informed about our study in case they received questions from migrants about it. After learning about the study, several of these local organizations also shared the contact data of their affiliated migrants, which was useful for building our sampling frame.
3. Increase the sample by asking for referrals
We learned from other studies that collect samples of drug users or sex workers that we could implement some elements of snowball sampling techniques in our own study. The methodology begins by constructing a sample from a small number of seeds (individuals), as diverse and representative of the target population as possible, and asks those seeds for a set number of referrals. Those referrals are then asked for referrals of their own, and the sample is built by iterating this process. The chain stops when the researchers have a representative sample of the target population. Importantly, we also learned that some studies that collect data on hard-to-reach populations use a two-part incentive in which individuals are compensated for answering the survey and also for every completed survey from their referrals.
Although we did not use the snowball methodology to construct our sample, we did use some of its elements: our sampling frame was constructed by combining the data shared by the migrant organizations with referrals from every individual we contacted in the screening process. We then used these combined data to select a representative sample of our target population. Because new referrals modify the sampling frame at each stage, this process requires dynamic sample selection: as new referrals are received, the residual representative sample is updated, as sketched below.
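For readers who want to see the mechanics, here is a minimal Python sketch of this dynamic updating. The names, the numbers, and the fixed two-referrals-per-respondent rule are illustrative assumptions, not the project's actual procedure.

```python
import random

# Toy sketch of dynamic sample selection with a growing sampling frame.
random.seed(0)          # reproducibility of the illustration only
TARGET = 120            # total surveys wanted (toy number, not the study's)
BATCH = 30              # individuals contacted per screening round

frame = [f"ngo_contact_{i:03d}" for i in range(80)]   # round 0: NGO contact lists
completed = []                                         # finished surveys

round_no = 0
while len(completed) < TARGET and len(completed) < len(frame):
    # Re-draw the residual sample from the not-yet-surveyed part of the frame.
    eligible = [c for c in frame if c not in completed]
    batch = random.sample(eligible, min(BATCH, len(eligible)))

    for person in batch:
        completed.append(person)                       # survey completed
        # Each respondent is asked for referrals, which enlarge the frame
        # (here, a fixed two referrals per respondent for simplicity).
        frame.extend(f"{person}_ref{k}" for k in range(2))

    round_no += 1
    print(f"round {round_no}: frame size {len(frame)}, surveys {len(completed)}")
```

The key point the sketch captures is simply that the eligible pool is recomputed every round, so later draws reflect the referrals gathered in earlier rounds.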
4. Cross-check survey language
This probably applies to other contexts as well, but it is worth mentioning. Although Venezuelans and Colombians both speak Spanish, they use different day-to-day words. To account for this, we asked Venezuelan migrants to review the survey instrument. This exercise allowed us to identify questions that were difficult to understand, were sensitive in the Venezuelan context, or could prompt negative reactions and compromise survey completion. For example, we learned that Venezuelans felt odd and distant when referred to as "usted" and preferred "tú". Although both words have the same meaning in Spanish, "usted" is generally understood as more formal than "tú" in Colombia; Venezuelans, however, generally felt that "usted" is used to talk to people you do not trust. We also had to modify the words used to refer to the geographic location of migrants (e.g., "municipio" for "ciudad"), their education level (some education levels were considered formal in Venezuela but not in Colombia), and the words used to ask whether migrants were able to certify their education in Colombia (e.g., "homologación" vs. "convalidación"), among other changes.
5. Hire enumerators from the target group
We also learned that response rates were higher when migrants recognized a familiar accent on the other end of the phone, so we hired Venezuelan migrants as enumerators. Enumerators from the same group were deeply committed, as they understood the importance of the study for their communities, and because all of them were migrants themselves, they were able to connect with respondents and convey empathy more effectively.
6. Care for enumerators' mental health
One important lesson, brought to the group by our co-author Andrés Moya, who has ample experience working on mental health issues with internally displaced populations in Colombia, is that because the reality of refugees (or, for that matter, of vulnerable populations in general) is hard, enumerators' mental health is often affected when collecting surveys. Hence, the IPA team conducted trainings with enumerators to prevent vicarious trauma or secondary traumatic stress, understood in this context as the indirect trauma that can result when enumerators are exposed second-hand to the difficult situations of survey respondents. The training included sessions with a psychologist who gave enumerators tools to avoid becoming physiologically charged and also offered tips for being more empathetic with respondents. Interestingly, since the enumerators were migrants themselves, they reported that these tools also helped them overcome traumas from their own arrival in Colombia.
7. Phone survey lessons
We also implemented lessons from other studies carrying out phone surveys to increase response rates. Standard practices include calling respondents at different times of the day, on different days, and multiple times, as well as shortening the surveys; a simple sketch of such a call rotation appears below. A comprehensive summary of standard practices for phone surveys can be found here.
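As an illustration only, here is a small Python sketch of one way to rotate call attempts across days and times of day. The number of attempts, the time slots, and the day offsets are hypothetical choices, not taken from our protocol or from any specific survey platform.

```python
from datetime import date, timedelta
from itertools import cycle

# Toy rotation of call attempts across days and times of day.
MAX_ATTEMPTS = 6                                   # attempts per respondent (illustrative)
TIME_SLOTS = ["morning", "afternoon", "evening"]   # vary the time of day
DAY_OFFSETS = [0, 1, 3, 5, 7, 9]                   # spread attempts across different days

def call_schedule(first_contact: date, max_attempts: int = MAX_ATTEMPTS):
    """Yield (date, time slot) pairs for successive call attempts."""
    slots = cycle(TIME_SLOTS)
    for offset in DAY_OFFSETS[:max_attempts]:
        yield first_contact + timedelta(days=offset), next(slots)

# Example: the attempt plan for one respondent first contacted on 1 Sept 2020.
for when, slot in call_schedule(date(2020, 9, 1)):
    print(when.isoformat(), slot)
```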
All in all, I truly hope these lessons are useful to inform and facilitate more studies of population groups that are often overlooked in economic research because of the lack of existing data.