Open access

Design and impact of hesitation gestures during human-robot resource conflicts

Published: 05 September 2013

Abstract

In collaborative tasks, people often use nonverbal gestures to coordinate their actions. When two people reach for the same object at the same time, they often respond to the imminent collision with jerky, halting hand motions that we term hesitation gestures. Implementing such communicative conflict-response behaviour on robots can be useful: in the many human-robot interaction contexts involving shared spaces and objects, it provides a fast and effective means for a robot to express awareness of a conflict and cede right-of-way during collaborative work with users. Our previous work suggests that when a six-degree-of-freedom (6-DOF) robot traces a simplified trajectory of recorded human hesitation gestures, the robot's motions are also perceived by humans as hesitation gestures. In this work, we present a characteristic motion profile derived from the recorded human hesitation motions, called the Acceleration-based Hesitation Profile (AHP), and test its efficacy for generating communicative hesitation responses by a robot in a fast-paced human-robot interaction experiment.
We did not find sufficient evidence that AHP-based robot responses improve human perception of the robot or reduce human-robot task completion time compared to traditional abrupt stopping behaviours. However, results from our in situ experiment suggest that subjects recognize AHP-based robot responses as hesitations and distinguish them from abrupt stopping behaviours.
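
The abstract does not specify the AHP's functional form. Purely as a hypothetical illustration of the kind of trajectory contrast the experiment evaluates, the sketch below compares an abrupt stop against a hesitation-like response whose deceleration briefly overshoots into a small retraction before settling to rest; the pulse shapes, timings, and speeds are assumptions for illustration, not the authors' recorded profiles, and only numpy is assumed.

```python
# Hypothetical sketch (not the authors' AHP): contrast an abrupt stop with a
# hesitation-like response built from a smooth deceleration pulse that briefly
# reverses the approach before settling to rest. All parameters are assumed.
import numpy as np

DT = 0.001  # integration time step [s]


def integrate(acc, v0):
    """Forward-Euler integration of an acceleration trace into velocity/position."""
    vel = v0 + np.cumsum(acc) * DT
    pos = np.cumsum(vel) * DT
    return vel, pos


def abrupt_stop(v0=0.4, t_react=0.10, decel=8.0, horizon=1.0):
    """Constant-velocity approach, then maximum deceleration straight to zero."""
    t = np.arange(0.0, horizon, DT)
    acc = np.where((t >= t_react) & (t < t_react + v0 / decel), -decel, 0.0)
    vel, pos = integrate(acc, v0)
    return t, vel, pos


def hesitation_like(v0=0.4, t_react=0.10, horizon=1.0):
    """Hypothetical hesitation: a smooth braking pulse that overshoots the stop
    (producing a brief retraction), then a gentler pulse that settles to rest."""
    t = np.arange(0.0, horizon, DT)
    brake = np.exp(-((t - (t_react + 0.15)) / 0.08) ** 2)    # main halt
    recover = np.exp(-((t - (t_react + 0.45)) / 0.10) ** 2)  # settle to rest
    acc = -brake * 1.3 * v0 / (np.sum(brake) * DT)           # overshoot by ~30%
    acc += recover * 0.3 * v0 / (np.sum(recover) * DT)       # cancel the overshoot
    vel, pos = integrate(acc, v0)
    return t, vel, pos


if __name__ == "__main__":
    for name, (t, vel, pos) in (("abrupt stop", abrupt_stop()),
                                ("hesitation-like", hesitation_like())):
        print(f"{name:16s} final velocity {vel[-1]:+.3f} m/s, "
              f"travel {pos[-1]:.3f} m, min velocity {vel.min():+.3f} m/s")
```

Running the sketch prints the final velocity, total travel, and minimum velocity for each response; the negative minimum velocity of the hesitation-like profile marks the brief retraction that an abrupt stop lacks.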



Information

Published In

Journal of Human-Robot Interaction, Volume 2, Issue 3
September 2013
109 pages

Publisher

Journal of Human-Robot Interaction Steering Committee


Author Tags

  1. collision avoidance
  2. hesitation
  3. human-robot interaction
  4. nonverbal communication
  5. reaching motions
  6. resource conflict
  7. trajectory design

Qualifiers

  • Research-article

Funding Sources

  • Natural Sciences and Engineering Research Council of Canada
  • Canada Foundation for Innovation
  • UBC Institute for Computing, Information and Cognitive Systems

Cited By

  • (2024) Robot-Assisted Decision-Making: Unveiling the Role of Uncertainty Visualisation and Embodiment. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-16. doi: 10.1145/3613904.3642911. Online publication date: 11-May-2024.
  • (2024) DRAWBOT: Making Everyday Objects Interactive. Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 554-558. doi: 10.1145/3610978.3640607. Online publication date: 11-Mar-2024.
  • (2024) Power in Human-Robot Interaction. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 269-282. doi: 10.1145/3610977.3634949. Online publication date: 11-Mar-2024.
  • (2023) Nonverbal Cues in Human–Robot Interaction: A Communication Studies Perspective. ACM Transactions on Human-Robot Interaction, 12(2), 1-21. doi: 10.1145/3570169. Online publication date: 15-Mar-2023.
  • (2021) Back-off. ACM Transactions on Human-Robot Interaction, 10(3), 1-25. doi: 10.1145/3418303. Online publication date: 11-Jul-2021.
  • (2021) Design of Hesitation Gestures for Nonverbal Human-Robot Negotiation of Conflicts. ACM Transactions on Human-Robot Interaction, 10(3), 1-25. doi: 10.1145/3418302. Online publication date: 11-Jul-2021.
  • (2021) Interactive Vignettes: Enabling Large-Scale Interactive HRI Research. 2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), 1289-1296. doi: 10.1109/RO-MAN50785.2021.9515376. Online publication date: 8-Aug-2021.
  • (2021) Maintaining efficient collaboration with trust-seeking robots. 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 3312-3319. doi: 10.1109/IROS.2016.7759510. Online publication date: 11-Mar-2021.
  • (2019) Spiking Neural Networks for early prediction in human–robot collaboration. International Journal of Robotics Research, 38(14), 1619-1643. doi: 10.1177/0278364919872252. Online publication date: 1-Dec-2019.
  • (2015) Tap and push. Journal of Human-Robot Interaction, 4(1), 95-113. doi: 10.5555/3109835.3109841. Online publication date: 22-Jul-2015.
