DOI: 10.1145/2348283.2348430

CrowdTerrier: automatic crowdsourced relevance assessments with terrier

Published: 12 August 2012

Abstract

In this demo, we present CrowdTerrier, an infrastructure extension to the open source Terrier IR platform that enables the semi-automatic generation of relevance assessments for a variety of document ranking tasks using crowdsourcing. The aim of CrowdTerrier is to reduce the time and expertise required to effectively crowdsource relevance assessments by abstracting away from the complexities of the crowdsourcing process. It achieves this by automating the assessment process as much as possible, via a close integration of the IR system that ranks the documents (Terrier) and the crowdsourcing marketplace that is used to assess those documents (Amazon's Mechanical Turk).
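
The demo paper does not include code, but the workflow the abstract describes (take documents ranked by Terrier for a query, publish them to Amazon Mechanical Turk as assessment tasks, and collect the resulting labels) follows a common pattern. The sketch below is a minimal, hypothetical illustration of that pattern using the boto3 MTurk client: the function name publish_assessment_hits, the HIT layout, and every parameter value are assumptions for illustration and do not reflect CrowdTerrier's actual interface.

```python
# Hypothetical sketch (not CrowdTerrier's actual API) of the workflow the
# abstract describes: publish Terrier-ranked documents to Amazon Mechanical
# Turk as relevance-assessment HITs.
import boto3

# The MTurk API is served from us-east-1; a sandbox endpoint_url can be
# supplied here for testing before paying real workers.
mturk = boto3.client("mturk", region_name="us-east-1")

# Minimal HTMLQuestion layout; the question wording and answer options are
# illustrative assumptions, not the interface generated by CrowdTerrier.
QUESTION_XML = """<?xml version="1.0" encoding="UTF-8"?>
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
<!DOCTYPE html>
<html><body>
  <script src="https://assets.crowd.aws/crowd-html-elements.js"></script>
  <p><b>Query:</b> {query}</p>
  <p><b>Document:</b> {doc_text}</p>
  <crowd-form>
    <p>Is this document relevant to the query?</p>
    <label><input type="radio" name="relevance" value="relevant"> Relevant</label>
    <label><input type="radio" name="relevance" value="not_relevant"> Not relevant</label>
  </crowd-form>
</body></html>
]]></HTMLContent>
  <FrameHeight>450</FrameHeight>
</HTMLQuestion>"""


def publish_assessment_hits(query, ranked_docs, reward="0.05"):
    """Create one relevance-assessment HIT per (query, document) pair."""
    hit_ids = []
    for doc_text in ranked_docs:
        hit = mturk.create_hit(
            Title="Judge the relevance of a document to a query",
            Description="Read a query and a document, then decide whether "
                        "the document is relevant.",
            Keywords="relevance, search, assessment",
            Reward=reward,                      # payment per assignment, in USD
            MaxAssignments=3,                   # redundant judgements per document
            LifetimeInSeconds=86400,            # HIT visible to workers for one day
            AssignmentDurationInSeconds=600,    # worker has 10 minutes per judgement
            Question=QUESTION_XML.format(query=query, doc_text=doc_text),
        )
        hit_ids.append(hit["HIT"]["HITId"])
    return hit_ids
```

Aggregating the redundant assignments per document (for example, by majority vote) would then yield the relevance labels that feed back into evaluation; automating such steps end-to-end within the IR platform is what the demo itself provides.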

References

[1] R. McCreadie, C. Macdonald, and I. Ounis. Identifying top news using crowdsourcing. Information Retrieval, 2012.
[2] I. Ounis, G. Amati, V. Plachouras, B. He, C. Macdonald, and C. Lioma. Terrier: A high performance and scalable information retrieval platform. In Proc. OSIR Workshop, 2006.

Cited By

  • Crowdsourcing for information retrieval: introduction to the special issue. Information Retrieval 16(2), 91-100 (2013). DOI: 10.1007/s10791-013-9222-7


    Published In

    SIGIR '12: Proceedings of the 35th international ACM SIGIR conference on Research and development in information retrieval
    August 2012, 1236 pages
    ISBN: 9781450314725
    DOI: 10.1145/2348283

    Publisher

    Association for Computing Machinery, New York, NY, United States



    Author Tags

    1. crowdsourcing
    2. relevance assessment
    3. terrier

    Qualifiers

    • Demonstration

    Conference

    SIGIR '12
