Crowdsourcing for multimedia combines human intelligence with contributions from a large number of individuals. In the field of multimedia, crowdsourcing is just beginning to realize its full potential to contribute to techniques, systems and data sets that advance the state of the art. The ACM Multimedia 2013 Workshop on Crowdsourcing for Multimedia (CrowdMM 2013) provides a forum for the presentation of crowdsourcing techniques for multimedia and the discussion of innovative ideas that will allow the field of multimedia research to make use of crowdsourcing.
The workshop uses presentations, posters and a panel to promote discussion and exchange between researchers concerning the scope and future of crowdsourcing. The workshop defines crowdsourcing in the broad sense: it encompasses both unsolicited human contributions, e.g., tags assigned by users to images, and also solicited contributions, e.g., annotations gathered by making use of crowdsourcing platforms that micro-outsource tasks to a large pool of human workers. As in 2012, the workshop pursues two high-level goals. First, it provides a venue to encourage multimedia research making use of human intelligence and taking advantage of human plurality. Second, the workshop gives an impetus to the multimedia community to define best practices for the use of crowdsourcing in multimedia and to shape emerging paradigms that will allow crowdsourcing to push forward the state of the art in multimedia research.
Proceeding Downloads
1000 songs for emotional analysis of music
Music is composed to be emotionally expressive, and emotional associations provide an especially natural domain for indexing and recommendation in today's vast digital music libraries. But such libraries require powerful automated tools, and the ...
Crowdsourcing for affective-interaction in computer games
Affective-interaction in computer games is a novel area with several new challenges, such as detecting players' facial expressions (e.g., happy, sad, surprise) in a robust manner. In this paper we describe a crowdsourcing effort for creating the ground-...
How do users make a people-centric slideshow?
This paper presents a pilot user study that attempts to shed light on the ways users create people-centric slideshows, with the objective of scaling it up to a crowdsourcing experiment. The study focuses on two major directions, namely image selection ...
Crowdsourced object segmentation with a game
We introduce a new algorithm for image segmentation based on crowdsourcing through a game: Ask'nSeek. The game provides information on the objects of an image, in the form of clicks that are either on the object or on the background. These logs ...
Divide and conquer: atomizing and parallelizing a task in a mobile crowdsourcing platform
In this paper we present conclusions about the advantages of efficient task formulation when using a crowdsourcing platform. In particular, we show how task atomization and distribution help obtain results efficiently. ...
Assessing internet video quality using crowdsourcing
In this paper, we present a subjective video quality evaluation system that has been integrated with different crowdsourcing platforms. We try to evaluate the feasibility of replacing the time-consuming and expensive traditional tests with a faster and ...
Crowdsourcing-based multimedia subjective evaluations: a case study on image recognizability and aesthetic appeal
Research on Quality of Experience (QoE) heavily relies on subjective evaluations of media. An important aspect of QoE concerns modeling and quantifying the subjective notions of 'beauty' (aesthetic appeal) and 'something well-known' (content ...
Crowdsourced evaluation of the perceived viewing quality in user-generated video
Recent research to measure the viewing quality (or QoE) of user-generated video focuses on transmission phenomena, e.g., stalling, or the effects of transcoding resulting from using compression algorithms. Our motivation is that current models and ...
Acceptance Rates
| Year | Submitted | Accepted | Rate |
| --- | --- | --- | --- |
| CrowdMM '14 | 26 | 8 | 31% |
| CrowdMM '13 | 16 | 8 | 50% |
| Overall | 42 | 16 | 38% |