Phone Surveys for Randomized Experiments
During the interviews, the most common budget concerns raised were the cost of data collection and of expert time. Therefore, the first section of this series of articles will focus on tools and methodologies that can lower the cost of data collection without lowering the quality of the data.
The Current Landscape of Phone Surveys
Data collection is a significant cost when conducting Randomized Controlled Trials (RCTs). Most of the time, the data required to answer the question at hand is not readily available, whether because of sample size, collection frequency, missing variables, or other case-specific constraints (Cole et al., 2020). However, data collection methodologies such as administrative data and phone surveys can be used to make RCTs faster and more frugal.
In-person surveys have traditionally been regarded more highly than online or phone surveys because their results are believed to be more representative of reality and their response rates are higher. However, in-person surveys are more costly than automated surveying methods: they require more trained professionals in the field, plus field expenses such as transportation, translators, and accommodation. In other words, in-person surveys are more time-consuming and require a higher monetary investment in human resources.
In a meta-analysis conducted in the USA in 1999, the researchers concluded that phone survey samples were skewed, as they represented “greater proportions of the well-educated, the wealthy, and whites” (Ellis & Krosnick, 1999). Lack of access to phone lines and refusal to participate were possible reasons for the demographic under-representation of low-income groups and non-whites, but the authors found only limited empirical evidence to support the second claim (Ellis & Krosnick, 1999). On the bright side, with the increased access to mobile phones in recent decades, phone and web-based surveys come with innovative ways to collect high-quality, low-cost survey data (Holloway, personal interview). For example, online platforms such as Surveda can be used to create phone-based, web-based, or mixed-mode surveys (InSTEDD). The Bloomberg Philanthropies Data for Health Initiative uses Surveda to run a phone-based survey collecting data on non-communicable diseases (NCDs). A study conducted using Surveda concluded that countries with higher cell phone usage generated more demographically representative data, as can be seen in figure 1 (Leo et al., 2015). However, some groups can still be underrepresented. Even though a large part of Afghanistan’s population uses mobile phones, rural women did not adopt mobile phones as fast as the rest of the population; thus, they are underrepresented in the data. A possible solution is to use weighting to conform the sample demographics to known population parameters. Also, if such discrepancies are observed before the study starts, researchers can try to reach underrepresented populations through different mediums.
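The weighting idea mentioned above can be sketched with a small post-stratification example: each respondent is weighted by the ratio of their group's population share to its sample share, so that underrepresented groups count proportionally more. All numbers below are hypothetical illustrations, not figures from the cited studies.

```python
# Post-stratification weighting sketch. All shares and outcomes are
# hypothetical illustrations.

# Group shares in the population (e.g. from a census) vs. the realized sample.
population_share = {"urban_women": 0.25, "urban_men": 0.25,
                    "rural_women": 0.25, "rural_men": 0.25}
sample_share = {"urban_women": 0.30, "urban_men": 0.30,
                "rural_women": 0.10, "rural_men": 0.30}  # rural women underrepresented

# Weight = population share / sample share for each group.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# A hypothetical outcome that differs by group (e.g. share reporting an NCD).
group_mean = {"urban_women": 0.6, "urban_men": 0.5,
              "rural_women": 0.2, "rural_men": 0.4}

# The unweighted estimate reflects the skewed sample; the weighted one
# recovers the population-level average.
unweighted = sum(sample_share[g] * group_mean[g] for g in group_mean)
weighted = sum(sample_share[g] * weights[g] * group_mean[g] for g in group_mean)
print(unweighted, weighted)
```

Here the underrepresented group (rural women) gets a weight of 2.5, pulling the weighted estimate back toward the population average.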
Phone surveys started to attract more attention when social distancing measures during the Covid-19 pandemic forced ongoing projects to find alternative data collection methodologies. Governments and NGOs also needed to evaluate and respond to the crisis quickly to protect people’s physical and mental well-being. The Living Standards Measurement Study (LSMS) at the World Bank supported high-frequency phone surveys on COVID-19 in Burkina Faso, Ethiopia, Malawi, Mali, and Nigeria to evaluate changes in access to necessities, labor, schooling, and other living standard measures (World Bank, 2021). In Ethiopia, for example, the team was able to complete data collection covering 3,249 households in 20 days (Wieser et al., 2020). As phone surveys attracted more attention, more resources appeared on how to conduct them well. For example, Innovations for Poverty Action (IPA) published a handbook on conducting remote surveys during a pandemic, comparing and contrasting different phone-based surveying methodologies for different needs (Glazerman et al., 2020).
Additionally, more rigorous research has evaluated the quality of data from phone surveys. Garlick, Orkin, & Quinn (2020) conducted “the first randomized controlled trial to compare microenterprise data from surveys of different frequencies” and mediums, such as in-person and phone-based. Over three months, they interviewed the same microenterprises twelve times. They concluded that there were no substantial differences in key outcomes across mediums and frequencies, and that using phone surveys neither lowered nor raised the quality of data for cross-sectional or static panel models. Lastly, phone surveys cost less than in-person surveys, although the authors did not report the specific cost difference (Garlick, Orkin, & Quinn, 2020).
Different Modes of Phone-Based Surveys
Even though we refer to phone surveys as low-cost alternatives throughout this piece, we also need to acknowledge the cost of setting up a call center and training staff to navigate the computer-based survey to avoid problems in data entry and management. Even though there is a fixed cost associated with this setup, the marginal cost per interview is relatively low. Thus, phone surveys are most beneficial with a bigger sample size or high-frequency interviews (American Association for Public Opinion Research). In the long run, the cost-effectiveness of phone-based surveys increases after the initial investment.
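This fixed-versus-marginal-cost tradeoff can be made concrete with a back-of-envelope calculation. The figures below are hypothetical illustrations, not taken from any of the cited studies; they only show how a break-even sample size falls out of the two cost structures.

```python
# Hypothetical cost structures: phone surveys have a high fixed cost
# (call center setup, staff training) but a low marginal cost per interview;
# in-person surveys are the opposite.
PHONE_FIXED, PHONE_PER_INTERVIEW = 15_000.0, 2.0
FIELD_FIXED, FIELD_PER_INTERVIEW = 2_000.0, 25.0

def total_cost(fixed, per_interview, n):
    """Total cost of surveying n respondents."""
    return fixed + per_interview * n

# Break-even n solves: PHONE_FIXED + 2n = FIELD_FIXED + 25n.
break_even = (PHONE_FIXED - FIELD_FIXED) / (FIELD_PER_INTERVIEW - PHONE_PER_INTERVIEW)
print(round(break_even))  # beyond this sample size, phone surveys are cheaper
```

Under these assumed numbers the phone survey becomes the cheaper option at roughly 565 respondents, and every additional interview or survey round after that widens the gap.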
First, we need to distinguish the different types of mobile phone data collection. Each has advantages and disadvantages, which we will discuss to inform your data collection design. The four automated survey methodologies that we will focus on are (Glazerman et al., 2020):
- Computer-Assisted Telephone Interviewing (CATI): telephone surveys with an enumerator and computerized data collection
- Interactive Voice Response (IVR) surveys: self-administered pre-recorded surveys where the participant responds through keystrokes or vocal replies
- Short Message Service (SMS) surveys: self-administered surveys where the participant responds by replying to the SMS
- Web surveys: self-administered surveys that are accessible through the internet
Computer-Assisted Telephone Interviewing (CATI):
CATI is the most costly phone surveying methodology as it is administered by an interviewer. First of all, it requires higher capital due to airtime, electricity, and enumerator training. Additionally, it is more time-consuming compared to other methodologies due to its length. However, it is also more suitable for achieving higher response rates and using longer questionnaires due to its interactive nature. Therefore, it might be a suitable option for a baseline survey, rather than a follow-up survey.
Interactive Voice Response (IVR) surveys:
IVR surveys are less expensive than CATI due to their shorter airtime and self-administered nature, though they are still more expensive than SMS surveys. One of the main advantages of IVR surveys over SMS surveys is that they do not require literacy from the participant and can be used to identify the primary language used by the respondents.
Web surveys:
Even though web surveys are the cheapest option, there are concerns about the representativeness of their samples due to the internet access and devices that participants need in order to take them (Glazerman et al., 2020).
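The tradeoffs among the four modes can be summarized in a toy decision helper. The rules below are a simplification of the comparisons discussed above, and the criteria and ordering are illustrative assumptions, not a recommendation from the Glazerman et al. (2020) handbook.

```python
# Toy decision helper codifying the mode tradeoffs discussed above.
# The criteria and their priority order are illustrative assumptions.
def suggest_mode(long_questionnaire, respondents_literate, internet_access):
    """Suggest a survey mode from coarse design criteria."""
    if long_questionnaire:
        return "CATI"  # interviewer-led; sustains longer questionnaires
    if not respondents_literate:
        return "IVR"   # voice-based; no literacy required
    if internet_access:
        return "Web"   # cheapest when infrastructure allows
    return "SMS"       # cheap, text-based fallback

print(suggest_mode(long_questionnaire=False,
                   respondents_literate=False,
                   internet_access=False))  # IVR
```

In practice these criteria interact with budget, sample frame, and response-rate targets, so a real design decision would weigh more dimensions than this sketch does.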
The following table, taken from the World Bank blog, summarizes the characteristics of different modes of mobile phone data collection (Himelein et al., 2020). In this article, we will focus on these three modes and evaluate their advantages and disadvantages based on case studies.
Here are some examples of the pricing of different modes of phone-based surveying, retrieved from the blog post by Himelein et al. (2020):
- Ghana 2017: $4.50 per case for a 10-minute CATI survey (L’Engle et al., 2018).
- Nigeria 2017: an IVR survey costs approximately double an SMS survey (Lau, 2019).
- Liberia 2015: $1.63 per completed IVR survey and $22.45 per completed CATI survey (Maffioli, 2019).
- Malawi 2015: $5.80 to $8.80 per CATI survey for a 42-question survey (Dabalen, 2016).
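To see how these per-unit prices compound at scale, consider the Liberia 2015 figures. The per-survey prices are the ones quoted above from Maffioli (2019); the sample size is a hypothetical illustration.

```python
# Liberia 2015 per-completed-survey prices (Maffioli, 2019).
cost_ivr, cost_cati = 1.63, 22.45

# Hypothetical sample size for illustration.
n = 2_000

ivr_total = cost_ivr * n
cati_total = cost_cati * n
saving = cati_total - ivr_total
print(ivr_total, cati_total, saving)
```

At 2,000 completed interviews the gap between the two modes is already tens of thousands of dollars, which is why mode choice matters most for large or high-frequency designs.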
Case Studies
“Improving Last-Mile Service Delivery using Phone-Based Monitoring” is in the nimble evaluations portfolio of the World Bank’s SIEF. The study aims to evaluate the effectiveness of a phone-based monitoring system for optimizing the delivery of a cash transfer program to farmers in the state of Telangana. The study uses stratified randomization within each district based on how many beneficiaries each office has. Randomly chosen offices received a phone-based monitoring system as a treatment to ensure that all of the checks were delivered to the farmers before the monsoon season. The study concluded that the phone-based monitoring system led to a 7.6% reduction in the number of farmers who did not receive the cash transfer. The whole evaluation took three months to run, and the final paper was completed within six months (Strategic Impact Evaluation Fund, 2021). In addition to being a relatively fast evaluation, the phone survey cost only $36,000, which helped keep the whole evaluation to around $50,000 (Markus, 2019). Given that the evaluation covered the entire state of Telangana, it is a highly cost-efficient randomized experiment. The researchers also highlighted that “the cost per phone survey is substantially lower than the cost of a field survey, which allowed for a much larger sample size within a fixed budget” (Muralidharan et al., 2018). In addition, as a robustness check on the quality of the phone survey data, the researchers compared the outcomes of the phone surveys with high-quality administrative bank records (Muralidharan et al., 2018).
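The stratified randomization used in the study can be sketched as follows: units (offices) are grouped into strata (here, districts), and treatment is randomized within each stratum so that treatment and control are balanced inside every district, not just overall. The office names, districts, and the 50/50 split below are hypothetical illustrations, not details from the paper.

```python
import random

# Hypothetical offices grouped by district (the stratum).
offices = {"district_a": ["a1", "a2", "a3", "a4"],
           "district_b": ["b1", "b2", "b3", "b4"]}

def stratified_assignment(strata, seed=0):
    """Randomly assign half the units in each stratum to treatment."""
    rng = random.Random(seed)  # fixed seed for a reproducible assignment
    treated = set()
    for units in strata.values():
        shuffled = units[:]
        rng.shuffle(shuffled)
        treated.update(shuffled[: len(shuffled) // 2])
    return treated

treated = stratified_assignment(offices)
# Balance holds within every district, not just in aggregate.
for district, units in offices.items():
    print(district, sum(u in treated for u in units), "of", len(units), "treated")
```

Stratifying this way prevents an unlucky draw where, say, all treated offices land in one district, which would confound district-level differences with the treatment effect.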
Further Research
Sensitivity to social desirability bias is one of the main concerns regarding phone surveys (Garlick, Orkin, & Quinn, 2020; Jäckle, Roberts, & Lynn, 2006; Holbrook, Green, & Krosnick, 2003). Researchers speculate that this may stem from the trust established during in-person interviews, which is lacking in phone surveys, but there is no empirical evidence to support this. Thus, a cognitive study on how in-person versus phone surveys affect social desirability bias needs to be conducted. In addition, this behavior pattern might change in the future, as different generations use different cues to establish trust and have different levels of comfort with technology. A possible solution is to identify the questions that are most likely to be subject to social desirability bias and investigate alternative ways to gather that information. Also, one of the above-mentioned studies used random digit dialing, so we can speculate that the randomly selected sample did not know much about the study and was therefore less invested in it (Holbrook, Green, & Krosnick, 2003). Identifying the eligible sample carefully and communicating the importance of the study might therefore improve the quality of the data.
Resources:
American Association for Public Opinion Research. (n.d.). Costs in RDD cell phone surveys. Retrieved March 20, 2021, from https://www.aapor.org/Education-Resources/Reports/Cell-Phone-Task-Force-Report/Costs.aspx
Ellis, C. H., & Krosnick, J. A. (1999). Comparing telephone and face to face surveys in terms of sample representativeness: A meta-analysis of demographic characteristics. Ann Arbor: University of Michigan, NES. Retrieved from http://users.clas.ufl.edu/kenwald/pos6757/spring02/tch59.pdf
Garlick, R., Orkin, K., & Quinn, S. (2020). Call me maybe: experimental evidence on frequency and medium effects in microenterprise surveys. The World Bank Economic Review, 34(2), 418–443. doi: 10.1093/wber/lhz021
Glazerman, S., Rosenbaum, M., Sandino, R., & Shaughnessy, L. (2020) Remote Surveying in a Pandemic: Handbook. Retrieved from https://www.poverty-action.org/sites/default/files/publications/IPA-Phone-Surveying-in-a-Pandemic-Handbook.pdf
Dabalen, A., Etang, A., Hoogeveen, J., Mushi, E., Schipper, Y., & von Engelhardt, J. (2016). Mobile phone panel surveys in developing countries: a practical guide for microdata collection. The World Bank. Retrieved from https://openknowledge.worldbank.org/bitstream/handle/10986/24595/9781464809040.pdf
Holbrook, A. L., Green, M. C., & Krosnick, J. A. (2003). Telephone versus face-to-face interviewing of national probability samples with long questionnaires: Comparisons of respondent satisficing and social desirability response bias. Public opinion quarterly, 67(1), 79–125. Retrieved from https://academic.oup.com/poq/article-abstract/67/1/79/1873914
Jäckle, A., Roberts, C., & Lynn, P. (2006). Telephone versus face-to-face interviewing: mode effects on data quality and likely causes: report on phase II of the ESS-Gallup mixed mode methodology project (No. 2006-41). ISER Working Paper Series. Retrieved from https://www.iser.essex.ac.uk/files/iser_working_papers/2006-41.pdf
InSTEDD. (n.d.). InSTEDD Surveda. Retrieved March 20, 2021, from https://surveda.instedd.org/
L’Engle, K., Sefa, E., Adimazoya, E. A., Yartey, E., Lenzi, R., Tarpo, C., … & Ampeh, Y. (2018). Survey research with a random digit dial national mobile phone sample in Ghana: Methods and sample quality. PloS one, 13(1), e0190902. Retrieved from https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0190902
Lau, C. Q., Cronberg, A., Marks, L., & Amaya, A. (2019, December). In search of the optimal mode for mobile phone surveys in developing countries: A comparison of IVR, SMS, and CATI in Nigeria. In Survey Research Methods (Vol. 13, No. 3, pp. 305–318). Retrieved from https://ojs.ub.uni-konstanz.de/srm/article/view/7375
Leo, B., Morello, R., Mellon, J., Peixoto, T., & Davenport, S., (2015) Do Mobile Phone Surveys Work in Poor Countries?. Bloomberg Philanthropies Data for Health Initiative. Retrieved from https://7a26dccb-8121-4cfe-9b54-5107ddb480e8.usrfiles.com/ugd/7a26dc_ea345a3addc84e87a38e7a6febe2e561.pdf
Maffioli, E. M. (2019). Relying Solely on Mobile Phone Technology: Sampling and Gathering Survey Data in Challenging Settings. Retrieved from https://elisamaffioli.files.wordpress.com/2020/03/maffioli_method.pdf
Markus, A. (2019, April). Testing the impact of phone calls on service delivery in India. Retrieved February 20, 2021, from https://blogs.worldbank.org/developmenttalk/testing-impact-phone-calls-service-delivery-india
Muralidharan, K., Niehaus, P., Sukhtankar, S., & Weaver, J. (2018). Improving last-mile service delivery using phone-based monitoring (No. w25298). National Bureau of Economic Research.
Strategic Impact Evaluation Fund. (2021). SIEF Event | Learning from nimble evaluations [video file]. Retrieved from https://vimeo.com/511611407
The World Bank. (2021). LSMS-Supported high-frequency phone surveys On covid-19. Retrieved March 20, 2021, from https://www.worldbank.org/en/programs/lsms/brief/lsms-launches-high-frequency-phone-surveys-on-covid-19#2
Wieser, C., Ambel, A. A., Bundervoet, T., Tsegay, A. H. (2020). Monitoring COVID-19 Impacts on Households in Ethiopia (Vol. 3) : Results from a High-Frequency Phone Survey of Households (English). Monitoring COVID-19 Impacts on Households in Ethiopia Washington, D.C. : World Bank Group. http://documents.worldbank.org/curated/en/392191591031322656/Results-from-a-High-Frequency-Phone-Survey-of-Households