
DOI: 10.1145/1178745.1178747

Human-centered collaborative interaction

Published: 27 October 2006

Abstract

Recent years have witnessed a shift in interest from single-user multimedia/multimodal interfaces toward support for interaction among groups of people working closely together, e.g. during meetings or problem-solving sessions. The introduction of technology to support collaborative practices, however, has not been free of problems: technology meant to support collaboration may itself introduce disruptions and reduce group effectiveness. Human-centered multimedia and multimodal approaches promise substantially enhanced user experiences by focusing attention on human perceptual and motor capabilities, and on actual user practices. In this paper we examine the problem of providing effective support for collaboration, focusing on the role of human-centered approaches that take advantage of multimodality and multimedia. We present illustrative examples of human-centered multimodal and multimedia solutions that provide mechanisms for dealing with the intrinsic complexity of supporting human-human interaction.





Published In

HCM '06: Proceedings of the 1st ACM international workshop on Human-centered multimedia
October 2006
138 pages
ISBN:1595935002
DOI:10.1145/1178745
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. design
  2. guidelines
  3. human-centered systems
  4. multimedia
  5. multimodal systems

Qualifiers

  • Article

Conference

MM06: The 14th ACM International Conference on Multimedia 2006
October 27, 2006
Santa Barbara, California, USA


Cited By

  • (2021) Jammify: Interactive Multi-sensory System for Digital Art Jamming. Human-Computer Interaction – INTERACT 2021, 10.1007/978-3-030-85607-6_2, pages 23-41. Online publication date: 27-Aug-2021.
  • (2015) Enhancement of interactivity using a new lecturing system (IELS). 2015 Science and Information Conference (SAI), 10.1109/SAI.2015.7237236, pages 801-806. Online publication date: Jul-2015.
  • (2013) Enhancing mobile phones to support collaborative communication for micro-entrepreneurs in emerging economies. Proceedings of the 11th Asia Pacific Conference on Computer Human Interaction, 10.1145/2525194.2525212, pages 142-149. Online publication date: 24-Sep-2013.
  • (2012) Collaborative museums. Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work, 10.1145/2145204.2145307, pages 681-684. Online publication date: 11-Feb-2012.
  • (2010) The Collaboration Platform. Proceedings of the 2010 Eighth International Conference on Creating, Connecting and Collaborating through Computing, 10.1109/C5.2010.13, pages 19-25. Online publication date: 25-Jan-2010.
  • (2009) A Session Engine Approach for Synchronous Collaborative Environments. Proceedings of the 2009 Seventh International Conference on Creating, Connecting and Collaborating through Computing, 10.1109/C5.2009.20, pages 144-150. Online publication date: 19-Jan-2009.
  • (2009) Exploring Multimodal Interaction in Collaborative Settings. Proceedings of the 13th International Conference on Human-Computer Interaction, Part II: Novel Interaction Methods and Techniques, 10.1007/978-3-642-02577-8_3, pages 19-28. Online publication date: 14-Jul-2009.
