DOI: 10.1145/2908131.2908140

It's getting crowded!: how to use crowdsourcing effectively for web science research

Published: 22 May 2016

Abstract

Since the term crowdsourcing was coined in 2006 [1], we have witnessed a surge in the adoption of the crowdsourcing paradigm. Crowdsourcing solutions are highly sought after to solve problems that require human intelligence at a large scale. Over the last decade there have been numerous applications of crowdsourcing, both in research and in practice, spanning disciplines from sociology to computer science. In research practice, crowdsourcing has unmistakably broken the barriers of qualitative and quantitative studies by providing a means to scale up previously constrained laboratory studies and controlled experiments. Today, one can easily build ground truths for evaluation and reach potential participants with diverse demographics around the clock, all within an unprecedentedly short amount of time. This comes with a number of challenges related to the lack of control over research subjects and to data quality.
A core characteristic of Web Science over the last decade has been its interdisciplinary approach to understanding the behavior of people on and off the Web, using a wide range of data sources. It is at this confluence that crowdsourcing provides an important opportunity to explore previously unfeasible experimental grounds.
In this tutorial, we will introduce the crowdsourcing paradigm in its entirety. We will discuss altruistic and reward-based crowdsourcing, covering both the needs of task requesters and the behavior of crowd workers. The tutorial will focus on paid microtask crowdsourcing and reflect on the challenges and opportunities that confront us. In an interactive demonstration session, we will walk the audience through the entire lifecycle of creating and deploying microtasks on an established crowdsourcing platform, optimizing task settings to meet task needs, and aggregating the results thereafter. We will present a selection of state-of-the-art methods to ensure high-quality results and inhibit malicious activity. The tutorial will be framed within the context of Web Science. The interdisciplinary nature of Web Science breeds rich ground for crowdsourcing, and we aim to spread the virtues of this growing field.

Reference

[1]
Jeff Howe. The rise of crowdsourcing. Wired Magazine, 14(6):1–4, 2006.

Cited By

  • (2021) Aggregation Techniques in Crowdsourcing: Multiple Choice Questions and Beyond. Proceedings of the 30th ACM International Conference on Information & Knowledge Management, pp. 4842–4844. DOI: 10.1145/3459637.3482032. Online publication date: 26-Oct-2021.
  • (2019) In What Mood Are You Today? Proceedings of the 10th ACM Conference on Web Science, pp. 373–382. DOI: 10.1145/3292522.3326010. Online publication date: 26-Jun-2019.
  • (2017) The Role of MTurk in Education Research: Advantages, Issues, and Future Directions. Educational Researcher, 46(6):329–334. DOI: 10.3102/0013189X17725519. Online publication date: 8-Aug-2017.


Published In

WebSci '16: Proceedings of the 8th ACM Conference on Web Science
May 2016
392 pages
ISBN: 9781450342087
DOI: 10.1145/2908131
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. crowdsourcing
  2. human computation
  3. web science

Qualifiers

  • Tutorial

Conference

WebSci '16
Sponsor: WebSci '16: ACM Web Science Conference
May 22 - 25, 2016
Hannover, Germany

Acceptance Rates

WebSci '16 Paper Acceptance Rate: 13 of 70 submissions, 19%
Overall Acceptance Rate: 245 of 933 submissions, 26%


