
Unlocking the expressivity of point lights

Published: 05 May 2012

Abstract

Small point lights (e.g., LEDs) are used as indicators in a wide variety of devices today, from digital watches and toasters to washing machines and desktop computers. Although exceedingly simple in their output (varying light intensity over time), their design space can be rich. Unfortunately, a survey of contemporary uses revealed that the vocabulary of lighting expression in popular use today is small, fairly unimaginative, and generally ambiguous in meaning. In this paper, we work through a structured design process that points the way towards a much richer set of expressive forms and more effective communication for this very simple medium. In this process, we make use of five different data-gathering and evaluation components to leverage the knowledge, opinions, and expertise of people outside our team. Our work starts by considering what information is typically conveyed in this medium. We go on to consider potential expressive forms: how information might be conveyed. We iteratively refine and expand these sets, concluding with ideas gathered from a panel of designers. Our final step was to make use of thousands of human judgments, gathered in a crowd-sourced fashion (265 participants), to measure the suitability of different expressive forms for conveying different information content. This results in a set of recommended light behaviors that mobile devices, such as smartphones, could readily employ.
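The medium the abstract describes is a single intensity value varying over time. As a purely illustrative sketch (these two behaviors and their parameters are assumptions for illustration, not the paper's recommended behavior set), two familiar point-light behaviors, a hard on/off blink and a smooth "breathing" pulse, can each be written as an intensity function of time:

```python
import math

def blink(t, period=1.0, duty=0.5):
    """Hard blink: full intensity for the first `duty` fraction of each period."""
    return 1.0 if (t % period) < duty * period else 0.0

def breathe(t, period=4.0):
    """Smooth 'breathing' pulse: intensity follows a raised sine wave in [0, 1]."""
    return 0.5 * (1.0 + math.sin(2.0 * math.pi * t / period - math.pi / 2.0))

# Sampling one period of the breathing behavior gives the intensity envelope
# a driver would feed to an LED (e.g., via PWM), here at 0.5 s steps over 4 s.
samples = [round(breathe(i * 0.5), 3) for i in range(9)]
# samples == [0.0, 0.146, 0.5, 0.854, 1.0, 0.854, 0.5, 0.146, 0.0]
```

Even in this tiny sketch, the design space the paper explores is visible as the parameter space: waveform shape, period, and duty cycle each change what the light can plausibly communicate.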

Supplementary Material

MOV File (paperfile689-3.mov)
Supplemental video for “Unlocking the expressivity of point lights”



Published In

CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
May 2012
3276 pages
ISBN:9781450310154
DOI:10.1145/2207676
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. expressive
  2. indicator lights
  3. led
  4. light behavior
  5. notification

Qualifiers

  • Research-article

Conference

CHI '12

Acceptance Rates

Overall Acceptance Rate 6,199 of 26,314 submissions, 24%



Cited By

  • Impacts of Robot Beep Timings on Trust Dynamics in Human-Robot Interaction. International Journal of Social Robotics (2024). DOI: 10.1007/s12369-024-01181-7. Online: 24-Oct-2024.
  • Can You Ear Me? Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7, 3 (2023), 1-23. DOI: 10.1145/3610925. Online: 27-Sep-2023.
  • Exploring the Potential of eHMIs as Traffic Light Alternatives. Adjunct Proceedings of the 15th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (2023), 99-104. DOI: 10.1145/3581961.3609885. Online: 18-Sep-2023.
  • Exploring Feedback Modality Designs to Improve Young Children's Collaborative Actions. Proceedings of the 25th International Conference on Multimodal Interaction (2023), 271-281. DOI: 10.1145/3577190.3614140. Online: 9-Oct-2023.
  • Integrating Robot Manufacturer Perspectives into Legible Factory Robot Light Communications. ACM Transactions on Human-Robot Interaction 12, 1 (2023), 1-33. DOI: 10.1145/3570732. Online: 15-Feb-2023.
  • At First Light: Expressive Lights in Support of Drone-Initiated Communication. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (2023), 1-17. DOI: 10.1145/3544548.3581062. Online: 19-Apr-2023.
  • Feel for me! Robot's Reactions to Abuse Influence Humans' Empathy. 2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) (2023), 1667-1674. DOI: 10.1109/RO-MAN57019.2023.10309616. Online: 28-Aug-2023.
  • Using Colour and Brightness for Sound Zone Feedback. Human-Computer Interaction – INTERACT 2023 (2023), 247-272. DOI: 10.1007/978-3-031-42280-5_15. Online: 25-Aug-2023.
  • Exploring the Recommendation Expressions of Multiple Robots Towards Single-Operator-Multiple-Robots Teleoperation. Human-Computer Interaction (2023), 46-60. DOI: 10.1007/978-3-031-35602-5_4. Online: 9-Jul-2023.
  • Effects of Colored LEDs in Robotic Storytelling on Storytelling Experience and Robot Perception. Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction (2022), 1053-1058. DOI: 10.5555/3523760.3523934. Online: 7-Mar-2022.
