
A proactive approach of robotic framework for making eye contact with humans

Published: 01 January 2014

Abstract

Making eye contact is one of the most important prerequisites for a human to initiate a conversation with others. It is not an easy task, however, for a robot to make eye contact with a human if the two are not facing each other initially or if the human is intensely engaged in a task. If the robot wishes to start communicating with a particular person, it should turn its gaze to that person and make eye contact with him or her. Such a turning action alone, however, is not enough to establish eye contact in all cases; in some situations the robot must perform stronger actions to attract the target person before meeting his or her gaze. In this paper, we propose a conceptual model of eye contact for social robots consisting of two phases: capturing attention and ensuring that attention has been captured. Evaluation experiments with human participants demonstrate the effectiveness of the proposed model in four viewing situations: central field of view, near peripheral field of view, far peripheral field of view, and out of field of view.
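The two-phase model and the four viewing situations described above lend themselves to a simple control sketch. The following Python snippet is a minimal illustration under stated assumptions, not the authors' implementation: the angular thresholds and the action names are hypothetical values chosen for the example, and the real system would drive them from perception of the target person.

```python
# Hypothetical sketch of a two-phase eye-contact protocol:
# phase 1 captures the target's attention with a cue scaled to how
# visible the robot is in the person's visual field; phase 2 checks
# that attention was actually captured before attempting mutual gaze.
# All thresholds and action labels below are illustrative assumptions.

def classify_field_of_view(angle_deg: float) -> str:
    """Classify where the robot falls in the person's visual field,
    given the angle (degrees) between the person's facing direction
    and the robot. Thresholds are assumed, not the paper's values."""
    a = abs(angle_deg)
    if a <= 30:
        return "central"
    if a <= 60:
        return "near peripheral"
    if a <= 100:
        return "far peripheral"
    return "out of view"

def capture_attention_action(field: str) -> str:
    """Pick a progressively stronger attention-capture cue the less
    visible the robot is (hypothetical mapping)."""
    return {
        "central": "turn gaze toward person",
        "near peripheral": "turn head and gaze",
        "far peripheral": "turn body, head, and gaze",
        "out of view": "move into view, then turn body and gaze",
    }[field]

def try_eye_contact(angle_deg: float, person_looked_back: bool) -> str:
    """Two-phase protocol: act to capture attention, then ensure the
    capture succeeded before meeting the person's gaze."""
    action = capture_attention_action(classify_field_of_view(angle_deg))
    if person_looked_back:  # phase 2: attention capture confirmed
        return f"{action} -> make eye contact"
    return f"{action} -> escalate to a stronger cue and retry"
```

The escalation in `capture_attention_action` mirrors the paper's observation that a gaze turn alone does not suffice in all viewing situations, so the robot's cue strength grows as the robot moves out of the person's central field of view.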



Published In

Advances in Human-Computer Interaction, Volume 2014
January 2014
212 pages
ISSN: 1687-5893
EISSN: 1687-5907

Publisher

Hindawi Limited

London, United Kingdom

Publication History

Accepted: 27 May 2014
Revised: 15 May 2014
Received: 20 January 2014
Published: 01 January 2014

