
An Experiment in Comparing Human-Computation Techniques

Published: 01 September 2012

Abstract

Human computation can address complex computational problems by tapping into large resource pools for relatively little cost. Two prominent human-computation techniques — games with a purpose (GWAP) and microtask crowdsourcing — can help resolve semantic-technology-related tasks, including knowledge representation, ontology alignment, and semantic annotation. To evaluate which approach is better with respect to costs and benefits, the authors employ categorization challenges in Wikipedia to ultimately create a large, general-purpose ontology. They first use the OntoPronto GWAP, then replicate its problem-solving setting in Amazon Mechanical Turk, using a similar task-design structure, evaluation mechanisms, and input data.
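The abstract mentions replicating the GWAP's problem-solving setting in Amazon Mechanical Turk with similar evaluation mechanisms. A common evaluation mechanism in microtask crowdsourcing is majority-vote aggregation of redundant worker judgments; the sketch below illustrates the idea for an OntoPronto-style class-vs-instance question. The function name, the example article, and the worker answers are hypothetical, not taken from the paper.

```python
from collections import Counter

def aggregate_votes(answers):
    """Majority-vote aggregation: return the most frequent label
    and the fraction of workers who agreed with it."""
    counts = Counter(answers)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(answers)

# Hypothetical worker judgments for one categorization question:
# should the Wikipedia article "Danube" be treated as a class or an instance?
answers = ["instance", "instance", "class", "instance", "instance"]
label, agreement = aggregate_votes(answers)
# label == "instance", agreement == 0.8
```

Redundancy plus an agreement threshold of this kind lets a requester trade cost (more judgments per item) against answer quality, which is the cost-benefit dimension the experiment compares across the two techniques.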

Cited By

  • (2023) Eye into AI: Evaluating the Interpretability of Explainable AI Techniques through a Game with a Purpose. Proceedings of the ACM on Human-Computer Interaction, 7(CSCW2), 1–22. DOI: 10.1145/3610064. Online publication date: 4-Oct-2023.
  • (2018) Is Virtual Citizen Science A Game? ACM Transactions on Social Computing, 1(2), 1–39. DOI: 10.1145/3209960. Online publication date: 27-Jun-2018.
  • (2016) Crowd-based ontology engineering with the uComp Protégé plugin. Semantic Web, 7(4), 379–398. DOI: 10.3233/SW-150181. Online publication date: 1-Jan-2016.
  • (2016) Reward Systems in Human Computation Games. Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play, 266–275. DOI: 10.1145/2967934.2968083. Online publication date: 16-Oct-2016.
  • (2015) Improving Paid Microtasks through Gamification and Adaptive Furtherance Incentives. Proceedings of the 24th International Conference on World Wide Web, 333–343. DOI: 10.1145/2736277.2741639. Online publication date: 18-May-2015.
  • (2013) Games with a Purpose or Mechanised Labour? Proceedings of the 13th International Conference on Knowledge Management and Knowledge Technologies, 1–8. DOI: 10.1145/2494188.2494210. Online publication date: 4-Sep-2013.
  • (2013) Social machines. Proceedings of the 22nd International Conference on World Wide Web, 885–890. DOI: 10.1145/2487788.2488074. Online publication date: 13-May-2013.


    Published In

    IEEE Internet Computing  Volume 16, Issue 5
    September 2012
    97 pages

    Publisher

    IEEE Educational Activities Department

    United States


    Author Tags

    1. Electronic publishing
    2. Encyclopedias
    3. GWAP
    4. Games
    5. Internet
    6. Mechanical Turk
    7. Ontologies
    8. Protons
    9. conceptual modeling
    10. crowdsourcing
    11. incentives
    12. motivators
    13. ontology

    Qualifiers

    • Article


