An Impression Evaluation of Robot Facial Expressions Considering Individual Differences by Using Biological Information

  • Conference paper
  • First Online:
Advances in Creativity, Innovation, Entrepreneurship and Communication of Design (AHFE 2021)

Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 276)


Abstract

Robot services have been in high demand due to the declining birthrate and aging society. However, the same robot expression can be perceived differently by different users. Therefore, the purpose of this study is to improve the impression a robot makes by estimating human emotion from biological information. A machine learning method is proposed that accounts for individual differences by combining impression evaluations with simultaneous measurements of electroencephalography (EEG) and pulse rate. The method was evaluated with three patterns of robot facial expression: matching the human emotion (synchronized), opposing the human emotion (reverse-synchronized), and funny facial expressions. Machine learning was applied to build a classification model that decides which robot facial expression suits a user's preference. As a result, individual differences were observed, and the machine learning approach reached 80% accuracy.
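
The abstract describes, at a high level, a pipeline that maps biological signals (EEG and pulse rate) to the expression pattern a given user prefers. The feature set and classifier are not specified in this excerpt, so the following is only a minimal sketch under assumptions: hypothetical per-trial features derived from EEG and pulse rate, and a generic scikit-learn classifier predicting one of the three expression patterns.

```python
# Hypothetical sketch of the classification step described in the abstract.
# The paper does not specify the features or classifier here; this assumes
# per-trial EEG/pulse features and a generic SVM, purely for illustration.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: one row per trial; columns are hypothetical biological features,
#    e.g. [eeg_attention, eeg_meditation, pulse_rate_mean, pulse_rate_var]
# y: expression pattern the participant rated most favorably in the
#    impression evaluation: 0 = synchronized, 1 = reverse-synchronized, 2 = funny
rng = np.random.default_rng(0)
X = rng.normal(size=(90, 4))        # placeholder data for illustration
y = rng.integers(0, 3, size=90)     # placeholder labels

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(model, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

With real impression-evaluation labels and measured EEG/pulse features in place of the placeholders, the cross-validated accuracy would be the figure comparable to the 80% reported in the abstract.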



Author information

Corresponding author

Correspondence to Muhammad Nur Adilin Mohd Anuardi.


Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Yu, K., Anuardi, M.N.A.M., Sripian, P., Sugaya, M. (2021). An Impression Evaluation of Robot Facial Expressions Considering Individual Differences by Using Biological Information. In: Markopoulos, E., Goonetilleke, R.S., Ho, A.G., Luximon, Y. (eds) Advances in Creativity, Innovation, Entrepreneurship and Communication of Design. AHFE 2021. Lecture Notes in Networks and Systems, vol 276. Springer, Cham. https://doi.org/10.1007/978-3-030-80094-9_61

  • DOI: https://doi.org/10.1007/978-3-030-80094-9_61

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-80093-2

  • Online ISBN: 978-3-030-80094-9

  • eBook Packages: Engineering, Engineering (R0)
