Huldra: a framework for collecting crowdsourced feedback on multimedia assets

Published: 05 August 2022

Abstract

Collecting crowdsourced feedback to evaluate, rank, or score multimedia content can be cumbersome and time-consuming. Most existing survey tools are complicated, hard to customize, or tailored to a specific asset type. In this paper, we present an open-source framework called Huldra, designed explicitly to address the challenges associated with user studies involving crowdsourced feedback collection. The web-based framework is built in a modular and configurable fashion that allows easy adjustment of the user interface (UI) and the multimedia content, while providing integrations with reliable and stable backend solutions to facilitate the collection and analysis of responses. Our proposed framework can serve as an online survey tool for researchers who work on topics such as Machine Learning (ML); audio, image, and video quality assessment; and Quality of Experience (QoE), and who require user studies for benchmarking various types of multimedia content.
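The abstract describes a modular framework whose UI and multimedia content are adjusted through configuration. As an illustrative sketch only — the key names below are assumptions for this example, not the actual Huldra configuration schema (see the GitHub repository for the real one) — a per-study configuration for such a framework, together with a helper that derives the asset sequence shown to one participant, might look like:

```javascript
// Hypothetical per-study configuration for a modular survey framework:
// a declarative object lists the multimedia cases to collect feedback on
// and the pages of the survey flow. All key names are illustrative.
const surveyConfig = {
  caseOrder: ["audio-1", "image-3", "video-2"], // asset IDs shown to each participant
  shuffle: true,                                // randomize case order per session
  pages: ["registration", "background", "cases", "summary", "end"],
};

// Derive the case sequence for one participant session.
// An injectable `rng` keeps the shuffle testable/reproducible.
function sessionCases(config, rng = Math.random) {
  const cases = [...config.caseOrder];
  if (config.shuffle) {
    // Fisher-Yates shuffle: unbiased in-place permutation
    for (let i = cases.length - 1; i > 0; i--) {
      const j = Math.floor(rng() * (i + 1));
      [cases[i], cases[j]] = [cases[j], cases[i]];
    }
  }
  return cases;
}
```

Keeping the asset list and survey flow in a single declarative object is what makes this kind of framework easy to repurpose across asset types: swapping a study from images to audio clips changes only configuration, not application code.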


Cited By

  • (2022) Experiences and Lessons Learned from a Crowdsourced-Remote Hybrid User Survey Framework. 2022 IEEE International Symposium on Multimedia (ISM), 161-162. DOI: 10.1109/ISM55400.2022.00035. Online publication date: Dec 2022.


Published In

MMSys '22: Proceedings of the 13th ACM Multimedia Systems Conference
June 2022
432 pages
ISBN:9781450392839
DOI:10.1145/3524273
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. UI
  2. crowdsourced feedback
  3. multimedia content
  4. open source
  5. survey
  6. user study
  7. web application

Qualifiers

  • Research-article

Conference

MMSys '22: 13th ACM Multimedia Systems Conference
June 14-17, 2022
Athlone, Ireland

Acceptance Rates

Overall Acceptance Rate 176 of 530 submissions, 33%

Article Metrics

  • Downloads (last 12 months): 181
  • Downloads (last 6 weeks): 17

Reflects downloads up to 26 Nov 2024.

