Michiya Yamamoto
2020 – today
- 2024
- [c66] Kosei Inoue, Saizo Aoyagi, Satoshi Fukumori, Michiya Yamamoto, Yukie Isaka, Katsuya Kitade: Development of an Eye-Movement Training System that Can Be Easily Used by Students Alone in Special Instructional Classrooms. HCI (8) 2024: 59-71
- [c65] Akari Kubota, Sota Fujiwara, Satoshi Fukumori, Saizo Aoyagi, Michiya Yamamoto: Emotion Estimation Using Laban Feature Values Based on a Multi-scale Kinesphere. SUI 2024: 36:1-36:3
- [c64] Kenjiro Okada, Saizo Aoyagi, Satoshi Fukumori, Michiya Yamamoto, Hiroyuki Abutani: An Eye Tracking Concussion Assessment System with Integrated MR-based Sports Vision Training. SUI 2024: 57:1-57:3
- 2023
- [c63] Mamoru Hiroe, Michiya Yamamoto, Takashi Nagamatsu: Implicit User Calibration for Gaze-tracking Systems Using Saliency Maps Filtered by Eye Movements. ETRA 2023: 62:1-62:5
- [c62] Sota Fujiwara, Fumiya Kobayashi, Saizo Aoyagi, Michiya Yamamoto: Features Focusing on the Direction of Body Movement in Emotion Estimation Based on Laban Movement Analysis. HCI (2) 2023: 182-195
- [c61] Seishiro Hara, Ryoya Fujii, Saizo Aoyagi, Michiya Yamamoto: Analysis of Effects of Raggedy Student CG Characters in Face-to-Face Lectures and Their On-Demand Streaming. HCI (6) 2023: 235-251
- [c60] Hayate Yamada, Fumiya Kobayashi, Saizo Aoyagi, Michiya Yamamoto: A Study on Eye-Region Expression as Naturally Expressed in a Situation of Spontaneously Evoked Emotions. HCI (43) 2023: 303-309
- 2022
- [c59] Hong Zhang, Haruka Shoda, Saizo Aoyagi, Michiya Yamamoto: A Study on the Back and Forth Manzai of Milkboy by Focusing on Embodied Motions and Actions for Liven-Up. HCI (45) 2022: 89-103
- 2021
- [c58] Saizo Aoyagi, Yoshihiro Sejima, Michiya Yamamoto: A Robot that Tells You It is Watching You with Its Eyes. HCI (2) 2021: 201-215
- [c57] Ryoya Fujii, Hayato Hirose, Saizo Aoyagi, Michiya Yamamoto: On-Demand Lectures that Enable Students to Feel the Sense of a Classroom with Students Who Learn Together. HCI (4) 2021: 268-282
- [c56] Fumiya Kobayashi, Masashi Sugimoto, Saizo Aoyagi, Michiya Yamamoto, Noriko Nagata: Modeling Salesclerks' Utterances in Bespoke Scenes and Evaluating Them Using a Communication Robot. HCI (44) 2021: 271-278
- 2020
- [c55] Saizo Aoyagi, Satoshi Fukumori, Michiya Yamamoto: A Long-Term Evaluation of Social Robot Impression. HCI (5) 2020: 131-144
- [c54] Ryosuke Kita, Michiya Yamamoto, Katsuya Kitade: Development of a Vision Training System Using an Eye Tracker by Analyzing Users' Eye Movements. HCI (45) 2020: 371-382
- [c53] Kodai Obata, Masashi Sugimoto, Saki Miyai, Yoichi Yamazaki, Fan Zhang, Michiya Yamamoto, Noriko Nagata: Construction of customers' emotion model in the bespoke tailoring using evaluation grid method. ICCE 2020: 1-4
2010 – 2019
- 2019
- [c52] Masashi Sugimoto, Yoichi Yamazaki, Fang Zhang, Saki Miyai, Kodai Obata, Michiya Yamamoto, Noriko Nagata: Differences in Customers' Interactions with Expert/Novice Salesclerks in a Bespoke Tailoring Situation: A Case Study on the Utterances of Salesclerks. HCI (35) 2019: 131-137
- [c51] Hayato Hirose, Ken Minamide, Satoshi Fukumori, Saizo Aoyagi, Michiya Yamamoto: Development of a System for Analyzing Hand-Raising Communication by Using a VR Headset to Synthesize Human-CG Character Interaction. HCI (5) 2019: 141-150
- [c50] Yuki Ono, Saizo Aoyagi, Masashi Sugimoto, Yoichi Yamazaki, Michiya Yamamoto, Noriko Nagata: Application of Classification Method of Emotional Expression Type Based on Laban Movement Analysis to Design Creation. HCI (2) 2019: 143-154
- [c49] Risa Muraya, Noriko Suzuki, Mamiko Sakata, Michiya Yamamoto: The Creative Power of Collaborative Pairs in Divergent Idea-Generation Task. HCI (4) 2019: 330-342
- [c48] Saizo Aoyagi, Atsuko Tanaka, Satoshi Fukumori, Michiya Yamamoto: VR system to simulate tightrope walking with a standalone VR headset and slack rails. VR 2019: 1293-1294
- 2018
- [c47] Mamoru Hiroe, Michiya Yamamoto, Takashi Nagamatsu: Implicit user calibration for gaze-tracking systems using an averaged saliency map around the optical axis of the eye. ETRA 2018: 56:1-56:5
- [c46] Michiya Yamamoto, Ryoma Matsuo, Satoshi Fukumori, Takashi Nagamatsu: Modeling corneal reflection for eye-tracking considering eyelid occlusion. ETRA 2018: 95:1-95:3
- [c45] Noriko Suzuki, Mayuka Imashiro, Haruka Shoda, Noriko Ito, Mamiko Sakata, Michiya Yamamoto: Effects of Group Size on Performance and Member Satisfaction. HCI (5) 2018: 191-199
- [c44] Ken Minamide, Satoshi Fukumori, Saizo Aoyagi, Michiya Yamamoto: Development of a Pair Ski Jump System Focusing on Improvement of Experience of Video Content. HCI (4) 2018: 562-571
- 2017
- [c43] Yoichi Yamazaki, Michiya Yamamoto, Noriko Nagata: Estimation of Emotional State in Personal Fabrication: Analysis of Emotional Motion Based on Laban Movement Analysis. Culture Computing 2017: 71-74
- [c42] Ryoma Matsuo, Haruka Sugimoto, Mamiko Sakata, Michiya Yamamoto: A Study on Extracting Attractive Regions from One-Point Perspective Paintings. HCI (1) 2017: 496-505
- [c41] Noriko Suzuki, Mayuka Imashiro, Mamiko Sakata, Michiya Yamamoto: The Effects of Group Size in the Furniture Assembly Task. HCI (4) 2017: 623-632
- [c40] Michiya Yamamoto, Saizo Aoyagi, Satoshi Fukumori, Tomio Watanabe: Development of a Communication Robot for Forwarding a User's Presence to a Partner During Video Communication. HCI (3) 2017: 640-649
- [c39] Michiya Yamamoto, Hirofumi Sakiyama, Satoshi Fukumori, Takashi Nagamatsu: An unobservable and untraceable input method for public spaces by reconstructing points of gaze only on servers. SUI 2017: 155
- 2016
- [c38] Daiki Sakai, Michiya Yamamoto, Takashi Nagamatsu, Satoshi Fukumori: Enter your PIN code securely!: utilization of personal difference of angle kappa. ETRA 2016: 317-318
- [c37] Kazuaki Tanaka, Michiya Yamamoto, Saizo Aoyagi, Noriko Nagata: An Affect Extraction Method in Personal Fabrication Based on Laban Movement Analysis. HCI (27) 2016: 188-193
- [c36] Saizo Aoyagi, Michiya Yamamoto, Satoshi Fukumori: Analysis of Hand Raising Actions for Group Interaction Enhancement. HCI (4) 2016: 321-328
- 2015
- [c35] Saizo Aoyagi, Ryuji Kawabe, Michiya Yamamoto, Tomio Watanabe: Hand-Raising Robot for Promoting Active Participation in Classrooms. HCI (5) 2015: 275-284
- [c34] Mamiko Sakata, Noriko Suzuki, Kana Shirai, Haruka Shoda, Michiya Yamamoto, Takeshi Sugio: How Do Japanese People Return a Greeting with a Bow? HCI (3) 2015: 503-513
- [c33] Daiki Sakai, Michiya Yamamoto, Takashi Nagamatsu: Framework for Realizing a Free-Target Eye-tracking System. IUI Companion 2015: 73-76
- [c32] Michiya Yamamoto, Saizo Aoyagi, Satoshi Fukumori, Tomio Watanabe: KiroPi: A life-log robot by installing embodied hardware on a tablet. RO-MAN 2015: 258-263
- 2014
- [c31] Ryuji Kawabe, Michiya Yamamoto, Saizo Aoyagi, Tomio Watanabe: Measurement of Hand Raising Actions to Support Students' Active Participation in Class. HCI (12) 2014: 199-207
- [c30] Takashi Nagamatsu, Kaoruko Fukuda, Michiya Yamamoto: Development of Corneal Reflection-based Gaze Tracking System for Public Use. PerDis 2014: 194-195
- [c29] Takashi Nagamatsu, Michiya Yamamoto, Gerhard Rigoll: Simulator for developing gaze sensitive environment using corneal reflection-based remote gaze tracker. SUI 2014: 142
- 2013
- [c28] Hiroki Kanegae, Masaru Yamane, Michiya Yamamoto, Tomio Watanabe: Effects of a Communication with Make-Believe Play in a Real-Space Sharing Edutainment System. HCI (15) 2013: 326-335
- [c27] Michiya Yamamoto, Hironobu Nakagawa, Koichi Egawa, Takashi Nagamatsu: Development of a Mobile Tablet PC with Gaze-Tracking Function. HCI (14) 2013: 421-429
- [c26] Takuya Matsumoto, Ryota Tamura, Michiya Yamamoto, Tomio Watanabe: Development of a life-log robot for supporting group interaction in everyday life. RO-MAN 2013: 216-219
- [c25] Koichi Egawa, Hiroaki Takai, Michiya Yamamoto, Takashi Nagamatsu: Eye-tracking volume simulation method to configure hardware settings for tangible and multi-user tabletop interaction. ITS 2013: 369-372
- 2012
- [j2] Takashi Nagamatsu, Yukina Iwamoto, Ryuichi Sugano, Junzo Kamahara, Naoki Tanaka, Michiya Yamamoto: Gaze Estimation Method Involving Corneal Reflection-Based Modeling of the Eye as a General Surface of Revolution about the Optical Axis of the Eye. IEICE Trans. Inf. Syst. 95-D(6): 1656-1667 (2012)
- [c24] Takashi Nagamatsu, Michiya Yamamoto, Ryuichi Sugano, Junzo Kamahara: Mathematical model for wide range gaze tracking system based on corneal reflections and pupil using stereo cameras. ETRA 2012: 257-260
- [c23] Michiya Yamamoto, Munehiro Komeda, Takashi Nagamatsu, Tomio Watanabe: Development of a gaze-and-touch algorithm for a tabletop Hyakunin-Isshu game with a computer opponent. RO-MAN 2012: 203-208
- [c22] Michiya Yamamoto, Yusuke Shigeno, Ryuji Kawabe, Tomio Watanabe: Development of a context-enhancing surface based on the entrainment of embodied rhythms and actions sharing via interaction. ITS 2012: 363-366
- 2011
- [c21] Kentaro Okamoto, Michiya Yamamoto, Tomio Watanabe: A Configuration Method of Visual Media by Using Characters of Audiences for Embodied Sport Cheering. HCI (2) 2011: 585-592
- [c20] Yuya Takao, Michiya Yamamoto, Tomio Watanabe: Development of Embodied Visual Effects Which Expand the Presentation Motion of Emphasis and Indication. HCI (2) 2011: 603-612
- [c19] Michiya Yamamoto, Hiroshi Sato, Keisuke Yoshida, Takashi Nagamatsu, Tomio Watanabe: Development of an Eye-Tracking Pen Display for Analyzing Embodied Interaction. HCI (11) 2011: 651-658
- [c18] Michiya Yamamoto, Munehiro Komeda, Takashi Nagamatsu, Tomio Watanabe: Hyakunin-Eyesshu: a tabletop Hyakunin-Isshu game with computer opponent by the action prediction based on gaze detection. NGCA 2011: 5
- [c17] Yusuke Shigeno, Michiya Yamamoto, Tomio Watanabe: Analysis of pointing motions by introducing a joint model for supporting embodied large-surface presentation. ITS 2011: 250-251
- 2010
- [c16] Takashi Nagamatsu, Michiya Yamamoto, Hiroshi Sato: MobiGaze: development of a gaze interface for handheld mobile devices. CHI Extended Abstracts 2010: 3349-3354
- [c15] Michiya Yamamoto, Takashi Nagamatsu, Tomio Watanabe: Development of eye-tracking pen display based on stereo bright pupil technique. ETRA 2010: 165-168
- [c14] Takashi Nagamatsu, Yukina Iwamoto, Junzo Kamahara, Naoki Tanaka, Michiya Yamamoto: Gaze estimation method based on an aspherical model of the cornea: surface of revolution about the optical axis of the eye. ETRA 2010: 255-258
- [c13] Michiya Yamamoto, Kouzi Osaki, Shotaro Matsune, Tomio Watanabe: An embodied entrainment character cell phone by speech and head motion inputs. RO-MAN 2010: 298-303
- [c12] Michiya Yamamoto, Munehiro Komeda, Takashi Nagamatsu, Tomio Watanabe: Development of eye-tracking tabletop interface for media art works. ITS 2010: 295-296
2000 – 2009
- 2009
- [c11] Michiya Yamamoto, Kouzi Osaki, Tomio Watanabe: Video Content Production Support System with Speech-Driven Embodied Entrainment Character by Speech and Hand Motion Inputs. HCI (3) 2009: 358-367
- 2008
- [j1] Michiya Yamamoto, Tomio Watanabe: Effects of Time Lag of Utterances to Communicative Actions on Embodied Interaction With Robot and CG Character. Int. J. Hum. Comput. Interact. 24(1): 87-107 (2008)
- [c10] Yoshihiro Sejima, Tomio Watanabe, Michiya Yamamoto: Analysis by Synthesis of Embodied Communication via VirtualActor with a Nodding Response Model. ISUC 2008: 225-230
- [c9] Michiya Yamamoto, Tomio Watanabe: Development of an edutainment system with interactors of a teacher and a student in which a user plays a double role of them. RO-MAN 2008: 659-664
- 2007
- [c8] Michiya Yamamoto, Tomio Watanabe: Development of an Embodied Image Telecasting Method Via a Robot with Speech-Driven Nodding Response. HCI (8) 2007: 1017-1025
- [c7] Kouzi Osaki, Tomio Watanabe, Michiya Yamamoto: Speech-driven embodied entrainment character system with hand motion input in mobile environment. ICMI 2007: 285-290
- [c6] Michiya Yamamoto, Tomio Watanabe: Analysis by Synthesis of an Information Presentation Method of Embodied Agent Based on the Time Lag Effects of Utterance to Communicative Actions. RO-MAN 2007: 43-48
- 2006
- [c5] Michiya Yamamoto, Tomio Watanabe: Time Lag Effects of Utterance to Communicative Actions on CG Character-Human Greeting Interaction. RO-MAN 2006: 629-634
- 2005
- [c4] Hiroyuki Nagai, Tomio Watanabe, Michiya Yamamoto: InterPointer: speech-driven embodied entrainment pointer system. AMT 2005: 213-218
- [c3] Michiya Yamamoto, Tomio Watanabe, Koji Osaki: Development of an embodied interaction system with InterActor by speech and hand motion input. RO-MAN 2005: 323-328
- 2003
- [c2] Michiya Yamamoto, Tomio Watanabe: Time delay effects of utterance to communicative actions on greeting interaction by using a voice-driven embodied interaction system. CIRA 2003: 217-222
1990 – 1999
- 1998
- [c1] Michiya Yamamoto, Takuya Mitani, Nobuyuki Ichiguchi, Tetsuo Tezuka, Hidekazu Yoshikawa: An experimental study on distributed virtual environment for integrated training system on machine maintenance. SMC 1998: 1479-1484
last updated on 2024-10-07 22:15 CEST by the dblp team
all metadata released as open data under CC0 1.0 license