Printed from https://ideas.repec.org/a/eee/beexfi/v17y2018icp22-27.html

Prolific.ac—A subject pool for online experiments

Author

Listed:
  • Palan, Stefan
  • Schitter, Christian
Abstract
The number of online experiments conducted with subjects recruited via online platforms has grown considerably in the recent past. While one commercial crowdworking platform, Amazon's Mechanical Turk, essentially established this field and has since dominated it, new alternatives offer services explicitly targeted at researchers. In this article, we present www.prolific.ac and lay out its suitability for recruiting subjects for social and economic science experiments. After briefly discussing key advantages and challenges of online experiments relative to lab experiments, we trace the platform's historical development, present its features, and contrast them with requirements for different types of social and economic experiments.

Suggested Citation

  • Palan, Stefan & Schitter, Christian, 2018. "Prolific.ac—A subject pool for online experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 17(C), pages 22-27.
  • Handle: RePEc:eee:beexfi:v:17:y:2018:i:c:p:22-27
    DOI: 10.1016/j.jbef.2017.12.004

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S2214635017300989
    Download Restriction: no

    File URL: https://libkey.io/10.1016/j.jbef.2017.12.004?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    References listed on IDEAS

    1. Benndorf, Volker & Moellers, Claudia & Normann, Hans-Theo, 2017. "Experienced vs. inexperienced participants in the lab: Do they behave differently?," DICE Discussion Papers 251, Heinrich Heine University Düsseldorf, Düsseldorf Institute for Competition Economics (DICE).
    2. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    3. Landers, Richard N. & Behrend, Tara S., 2015. "An Inconvenient Truth: Arbitrary Distinctions Between Organizational, Mechanical Turk, and Other Convenience Samples," Industrial and Organizational Psychology, Cambridge University Press, vol. 8(2), pages 142-164, June.
    4. Antonio A. Arechar & Simon Gächter & Lucas Molleman, 2018. "Conducting interactive experiments online," Experimental Economics, Springer;Economic Science Association, vol. 21(1), pages 99-131, March.
    5. Urs Fischbacher & Franziska Föllmi-Heusi, 2013. "Lies In Disguise—An Experimental Study On Cheating," Journal of the European Economic Association, European Economic Association, vol. 11(3), pages 525-547, June.
    6. Johannes Abeler & Armin Falk & Lorenz Goette & David Huffman, 2011. "Reference Points and Effort Provision," American Economic Review, American Economic Association, vol. 101(2), pages 470-492, April.
    7. Ofra Amir & David G Rand & Ya'akov Kobi Gal, 2012. "Economic Games on the Internet: The Effect of $1 Stakes," PLOS ONE, Public Library of Science, vol. 7(2), pages 1-4, February.
    8. Neil Stewart & Christoph Ungemach & Adam J. L. Harris & Daniel M. Bartels & Ben R. Newell & Gabriele Paolacci & Jesse Chandler, "undated". "The Average Laboratory Samples a Population of 7,300 Amazon Mechanical Turk Workers," Mathematica Policy Research Reports f97b669c7b3e4c2ab95c9f805, Mathematica Policy Research.
    9. Ben Greiner, 2015. "Subject pool recruitment procedures: organizing experiments with ORSEE," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 1(1), pages 114-125, July.
    10. Matthew J C Crump & John V McDonnell & Todd M Gureckis, 2013. "Evaluating Amazon's Mechanical Turk as a Tool for Experimental Behavioral Research," PLOS ONE, Public Library of Science, vol. 8(3), pages 1-18, March.
    11. Volker Benndorf & Claudia Moellers & Hans-Theo Normann, 2017. "Experienced vs. inexperienced participants in the lab: do they behave differently?," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 3(1), pages 12-25, July.
    12. Marreiros, Helia & Tonin, Mirco & Vlassopoulos, Michael & Schraefel, M.C., 2017. "“Now that you mention it”: A survey experiment on information, inattention and online privacy," Journal of Economic Behavior & Organization, Elsevier, vol. 140(C), pages 1-17.
    13. repec:cup:judgdm:v:5:y:2010:i:5:p:411-419 is not listed on IDEAS
    14. Siddharth Suri & Duncan J Watts, 2011. "Cooperation and Contagion in Web-Based, Networked Public Goods Experiments," PLOS ONE, Public Library of Science, vol. 6(3), pages 1-18, March.
    15. Cooper, David J., 2014. "A Note on Deception in Economic Experiments," Journal of Wine Economics, Cambridge University Press, vol. 9(02), pages 111-114, August.
    16. repec:cup:judgdm:v:10:y:2015:i:5:p:479-491 is not listed on IDEAS
    17. Chen, Daniel L. & Schonger, Martin & Wickens, Chris, 2016. "oTree—An open-source platform for laboratory, online, and field experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 9(C), pages 88-97.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Roggenkamp, Hauke C., 2024. "Revisiting ‘Growth and Inequality in Public Good Provision’—Reproducing and Generalizing Through Inconvenient Online Experimentation," OSF Preprints 6rn97, Center for Open Science.
    2. Antonio A. Arechar & Simon Gächter & Lucas Molleman, 2018. "Conducting interactive experiments online," Experimental Economics, Springer;Economic Science Association, vol. 21(1), pages 99-131, March.
    3. Benndorf, Volker & Rau, Holger A. & Sölch, Christian, 2019. "Minimizing learning in repeated real-effort tasks," Journal of Behavioral and Experimental Finance, Elsevier, vol. 22(C), pages 239-248.
    4. Hyndman, Kyle & Walker, Matthew J., 2022. "Fairness and risk in ultimatum bargaining," Games and Economic Behavior, Elsevier, vol. 132(C), pages 90-105.
    5. Horváth, Gergely, 2023. "Peer effects through receiving advice in job search: An experimental study," Journal of Economic Behavior & Organization, Elsevier, vol. 216(C), pages 494-519.
    6. Irene Maria Buso & Daniela Di Cagno & Lorenzo Ferrari & Vittorio Larocca & Luisa Lorè & Francesca Marazzi & Luca Panaccione & Lorenzo Spadoni, 2021. "Lab-like findings from online experiments," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 184-193, December.
    7. Jonathan Schulz & Uwe Sunde & Petra Thiemann & Christian Thoeni, 2019. "Selection into Experiments: Evidence from a Population of Students," Discussion Papers 2019-09, The Centre for Decision Research and Experimental Economics, School of Economics, University of Nottingham.
    8. Thiemann, Petra & Schulz, Jonathan & Sunde, Uwe & Thöni, Christian, 2022. "Selection into experiments: New evidence on the role of preferences, cognition, and recruitment protocols," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 98(C).
    9. Irene Maria Buso & Daniela Di Cagno & Sofia De Caprariis & Lorenzo Ferrari & Vittorio Larocca & Francesca Marazzi & Luca Panaccione & Lorenzo Spadoni, 2020. "The Show Must Go On: How to Elicit Lablike Data on the Effects of COVID-19 Lockdown on Fairness and Cooperation," Working Papers CESARE 2/2020, Dipartimento di Economia e Finanza, LUISS Guido Carli.
    10. Dato, Simon & Feess, Eberhard & Nieken, Petra, 2019. "Lying and reciprocity," Games and Economic Behavior, Elsevier, vol. 118(C), pages 193-218.
    11. Nicolas Jacquemet & Alexander G James & Stéphane Luchini & James J Murphy & Jason F Shogren, 2021. "Do truth-telling oaths improve honesty in crowd-working?," PLOS ONE, Public Library of Science, vol. 16(1), pages 1-18, January.
    12. Wladislaw Mill & Cornelius Schneider, 2023. "The Bright Side of Tax Evasion," CESifo Working Paper Series 10615, CESifo.
    13. Chapkovski, Philipp, 2023. "Conducting interactive experiments on Toloka," Journal of Behavioral and Experimental Finance, Elsevier, vol. 37(C).
    14. Irene Maria Buso & Daniela Di Cagno & Sofia De Caprariis & Lorenzo Ferrari & Vittorio Larocca & Luisa Lorè & Francesca Marazzi & Luca Panaccione & Lorenzo Spadoni, 2020. "Lab-like Findings of Non-Lab Experiments: a Methodological Proposal and Validation," Working Papers CESARE 3/2020, Dipartimento di Economia e Finanza, LUISS Guido Carli.
    15. Guo, Yiting & Shachat, Jason & Walker, Matthew J. & Wei, Lijia, 2023. "On the generalizability of using mobile devices to conduct economic experiments," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 106(C).
    16. Shaun P. Hargreaves Heap & Eugenio Levi & Abhijit Ramalingam, 2021. "Group identification and giving: in-group love, out-group hate and their crowding out," MUNI ECON Working Papers 2021-07, Masaryk University, revised Feb 2023.
    17. Chapkovski, Philipp, 2022. "Interactive Experiments in Toloka," EconStor Preprints 249771, ZBW - Leibniz Information Centre for Economics.
    18. Marcus Giamattei & Kyanoush Seyed Yahosseini & Simon Gächter & Lucas Molleman, 2020. "LIONESS Lab: a free web-based platform for conducting interactive experiments online," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(1), pages 95-111, June.
    19. Heinicke, Franziska & Rosenkranz, Stephanie & Weitzel, Utz, 2019. "The effect of pledges on the distribution of lying behavior: An online experiment," Journal of Economic Psychology, Elsevier, vol. 73(C), pages 136-151.
    20. Simon Gächter & Lingbo Huang & Martin Sefton, 2016. "Combining “real effort” with induced effort costs: the ball-catching task," Experimental Economics, Springer;Economic Science Association, vol. 19(4), pages 687-712, December.

    More about this item

    Keywords

    Prolific; Online experiment; Subject pool

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • C81 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Methodology for Collecting, Estimating, and Organizing Microeconomic Data; Data Access
    • B41 - Schools of Economic Thought and Methodology - - Economic Methodology - - - Economic Methodology


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:beexfi:v:17:y:2018:i:c:p:22-27. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: https://www.journals.elsevier.com/journal-of-behavioral-and-experimental-finance.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.