
DOI: 10.1145/2835966.2835974

Mobile Interfaces for Crowdsourced Multimedia Microtasks

Published: 17 December 2015

Abstract

Crowdsourced mobile microtasking represents a significant opportunity in emerging economies such as India, which are characterized by high levels of mobile phone penetration and large numbers of educated people who are unemployed or underemployed. Indeed, mobile phones have been used successfully in many parts of the world for microtasking, primarily for crowdsourced data collection and text- or image-based tasks. More complex tasks, such as annotation of multimedia content like audio or video, have traditionally been confined to desktop interfaces. With the rapid evolution in the multimedia capabilities of mobile phones in these geographies, we believe that the nature of microtasks carried out on these devices, as well as the design of interfaces for such microtasks, warrants investigation.
In this paper we explore the design of mobile phone interfaces for a set of multimedia-based microtasks on feature phones, which represent the vast majority of multimedia-capable mobile phones in these geographies. As part of an initial study using paper prototypes, we evaluate three types of multimedia content: images, audio and video, and three interfaces for data input: Direct Entry, Scroll Key Input and Key Mapping. We observe that while there are clear interface preferences for image and audio tasks, the user preference for video tasks varies based on the 'task complexity', i.e. the 'density' of data the annotator has to deal with. In a second study, we prototype two different interfaces for video-based annotation tasks: a single-screen input method and a two-screen phased interface. We evaluate the two interface designs and the three data input methods studied earlier by means of a user study with 36 participants. Our findings show that, where less dense data was concerned, participants prefer Key Mapping as the input technique. For dense data, while participants prefer Key Mapping, our data shows that the accuracy of data input with Key Mapping is significantly lower than that with Scroll Key Input. The study also provides insight into the game plan each user develops and employs to input data. We believe these findings will enable other researchers to build effective user interfaces for mobile microtasks, and be of value to UI developers, HCI researchers and microtask designers.
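
The paper's prototypes are not reproduced here, but the distinction between the three data-input methods can be illustrated with a short sketch. The following Python snippet is a minimal, hypothetical model (not the authors' implementation) of how a numeric-keypad feature phone might capture an annotation label under Direct Entry, Scroll Key Input and Key Mapping; the label vocabulary, function names and key bindings are assumptions made purely for illustration.

# Hypothetical sketch: three keypad input styles for a feature-phone
# annotation microtask. All names, labels and key bindings are illustrative;
# the paper's actual prototypes are not reproduced here.

LABELS = ["person", "vehicle", "animal", "text", "other"]

def direct_entry(typed_text):
    # Direct Entry: the worker types the label itself (e.g. via multi-tap text entry).
    return typed_text.strip().lower()

def scroll_key_input(down_presses, labels=LABELS):
    # Scroll Key Input: up/down keys move a cursor through the label list and a
    # confirm key selects it; modelled here as the net number of 'down' presses.
    return labels[down_presses % len(labels)]

KEY_MAP = {str(i + 1): label for i, label in enumerate(LABELS)}

def key_mapping(key, key_map=KEY_MAP):
    # Key Mapping: each numeric key is bound to one label, so a single press
    # annotates the current item; unmapped keys return None and are ignored.
    return key_map.get(key)

if __name__ == "__main__":
    print(direct_entry("Vehicle"))              # vehicle
    print(scroll_key_input(2))                  # animal
    print(key_mapping("1"), key_mapping("9"))   # person None

Under this reading, Key Mapping trades per-item speed (a single key press) against the risk of mis-remembering the key-to-label binding, which is consistent with the abstract's observation that its accuracy drops for dense video data relative to Scroll Key Input.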


Cited By

  • (2021) Usability Evaluation of Cultural Heritage Crowdsourcing System (CHCS). In Digital Literacy and Socio-Cultural Acceptance of ICT in Developing Countries, pp. 273-290. DOI: 10.1007/978-3-030-61089-0_17. Online publication date: 1 June 2021.



Information

Published In

IndiaHCI '15: Proceedings of the 7th Indian Conference on Human-Computer Interaction
December 2015
182 pages
ISBN: 9781450340533
DOI: 10.1145/2835966

In-Cooperation

  • Tata Consultancy Services
  • Microsoft Research

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Mobile crowdsourcing
  2. feature phones
  3. micro-tasking
  4. mobile UI
  5. multimedia
  6. task complexity

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Conference

IndiaHCI'15

Acceptance Rates

Overall Acceptance Rate 33 of 93 submissions, 35%


