Jorrat, Diego (2020): Recruiting experimental subjects using WhatsApp.
MPRA_paper_101467.pdf (1MB)
Abstract
The aim of many experiments is to estimate the effect of different interventions on subjects' decision-making. However, obtaining large samples while preserving internal validity is challenging. This paper presents a low-cost alternative recruitment method that can quickly yield a very large number of participants (700 in 5 hours). We asked 14 students to invite their WhatsApp contacts to participate in an online experiment. The students created a total of 80 diffusion groups of 25 contacts each. Using the diffusion groups as clusters, we ran a cluster randomization procedure to assign subjects to a framing experiment (treatment + control). Attrition, duplicates, and uninvited subjects occurred at the same rate across the treatment and control groups. Moreover, the experiment yielded results consistent with the framing literature.
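The cluster randomization step described above can be sketched in a few lines. This is a minimal illustration, not the authors' actual code: the group names and the even 50/50 split between arms are assumptions for the example.

```python
import random

def cluster_randomize(clusters, seed=0):
    """Assign whole clusters (e.g. WhatsApp diffusion groups) to a
    treatment or control arm, so every member of a group shares the
    same assignment."""
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    shuffled = list(clusters)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {c: ("treatment" if i < half else "control")
            for i, c in enumerate(shuffled)}

# 80 hypothetical diffusion groups, mirroring the design in the paper
groups = [f"group_{i}" for i in range(80)]
assignment = cluster_randomize(groups, seed=42)
```

Randomizing at the cluster level keeps treatment assignment constant within each WhatsApp group, which avoids contamination between arms when contacts in the same group talk to each other.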
Item Type: | MPRA Paper |
---|---|
Original Title: | Recruiting experimental subjects using WhatsApp |
English Title: | Recruiting experimental subjects using WhatsApp |
Language: | English |
Keywords: | Recruiting, Online Experiments, Prisoner's Dilemma, Randomization |
Subjects: | C8 - Data Collection and Data Estimation Methodology; Computer Programs ; C9 - Design of Experiments ; C99 - Design of Experiments: Other ; D70 - Analysis of Collective Decision-Making: General |
Item ID: | 101467 |
Depositing User: | Diego Jorrat |
Date Deposited: | 07 Jul 2020 07:19 |
Last Modified: | 07 Jul 2020 07:19 |
URI: | https://mpra.ub.uni-muenchen.de/id/eprint/101467 |