DOI: 10.1145/2809563.2809608

MicroTrails: comparing hypotheses about task selection on a crowdsourcing platform

Published: 21 October 2015

Abstract

To optimize the workflow on commercial crowdsourcing platforms like Amazon Mechanical Turk or Microworkers, it is important to understand how users choose their tasks. Current work usually explores the underlying processes through user studies based on surveys with a limited set of participants. In contrast, we formulate hypotheses based on the findings of these studies and, instead of verifying them through user feedback, compare them directly on data from a commercial crowdsourcing platform. For evaluation, we use a Bayesian approach called HypTrails, which allows us to derive a relative ranking of the corresponding hypotheses. The hypotheses considered are based, for example, on task categories, monetary incentives, or the semantic similarity of task descriptions. We find that, in our scenario, hypotheses based on employers as well as on task descriptions work best.
Overall, we objectively compare different factors influencing users when choosing their tasks. Our approach enables crowdsourcing companies to better understand their users in order to optimize their platforms, e.g., by incorporating the gained knowledge about these factors into task recommendation systems.
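
At the core of this comparison is the HypTrails evidence computation (Singer et al., WWW 2015): each hypothesis is expressed as a matrix of beliefs about task-to-task transitions, the beliefs are converted into Dirichlet priors of increasing strength k, and the hypotheses are ranked by the marginal likelihood (evidence) of the observed trails under a first-order Markov chain. The following Python sketch illustrates that scheme on invented numbers; the transition counts, the belief matrices, and all names are hypothetical assumptions for illustration, not the paper's actual data or hypotheses.

    import numpy as np
    from scipy.special import gammaln

    def log_evidence(counts, alpha):
        """Log marginal likelihood of observed transition counts under a
        first-order Markov chain with row-wise Dirichlet(alpha) priors."""
        row_a = alpha.sum(axis=1)             # A_i = sum_j alpha_ij
        row_n = (counts + alpha).sum(axis=1)  # A_i + N_i
        return np.sum(gammaln(row_a) - gammaln(row_n)
                      + np.sum(gammaln(counts + alpha) - gammaln(alpha), axis=1))

    # Hypothetical toy data: observed transitions between three tasks,
    # where tasks 0 and 1 are assumed to come from the same employer.
    counts = np.array([[5., 2., 0.],
                       [3., 4., 1.],
                       [0., 1., 6.]])

    hypotheses = {
        # belief: workers keep working for the employer they started with
        "employer": np.array([[1., 1., 0.],
                              [1., 1., 0.],
                              [0., 0., 1.]]),
        # baseline: task selection is completely unstructured
        "uniform": np.ones((3, 3)),
    }

    # Express each belief as Dirichlet pseudo-counts: distribute k counts per
    # row proportionally to the belief, plus 1 so no transition is impossible.
    for k in (1, 10, 100):
        for name, belief in hypotheses.items():
            prior = belief / belief.sum(axis=1, keepdims=True) * k + 1.0
            print(f"k={k:3d}  {name:8s}  log evidence = {log_evidence(counts, prior):.2f}")

At each prior strength k, the hypothesis with the higher evidence explains the observed trails better; differences in log evidence correspond to log Bayes factors, which is how such a comparison yields a relative ranking of hypotheses.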





    Published In

    i-KNOW '15: Proceedings of the 15th International Conference on Knowledge Technologies and Data-driven Business
    October 2015
    314 pages
    ISBN:9781450337212
    DOI:10.1145/2809563
    General Chairs: Stefanie Lindstaedt, Tobias Ley, Harald Sack

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. HypTrails
    2. crowdsourcing
    3. human trails

    Qualifiers

    • Research-article


    Acceptance Rates

    i-KNOW '15 paper acceptance rate: 25 of 78 submissions (32%)
    Overall acceptance rate: 77 of 238 submissions (32%)


    Cited By

    • CompTrails: comparing hypotheses across behavioral networks. Data Mining and Knowledge Discovery 38(3):1258-1288, May 2024. DOI: 10.1007/s10618-023-00996-8
    • A Bayesian Method for Comparing Hypotheses About Human Trails. ACM Transactions on the Web 11(3):1-29, June 2017. DOI: 10.1145/3054950
    • Crowdsourcing Quality of Experience Experiments. In Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments, pages 154-190, September 2017. DOI: 10.1007/978-3-319-66435-4_7
    • FolkTrails. In Proceedings of the 25th ACM International on Conference on Information and Knowledge Management, pages 2311-2316, October 2016. DOI: 10.1145/2983323.2983686
