AH 2015: Singapore
- Suranga Nanayakkara, Ellen Yi-Luen Do, Jun Rekimoto, Jochen Huber, Bing-Yu Chen: Proceedings of the 6th Augmented Human International Conference, AH 2015, Singapore, March 9-11, 2015. ACM 2015, ISBN 978-1-4503-3349-8
- Yuta Itoh, Gudrun Klinker: Vision enhancement: defocus correction via optical see-through head-mounted displays. 1-8
- Qianli Xu, Michal Mukawa, Liyuan Li, Joo-Hwee Lim, Cheston Tan, Shue-Ching Chia, Tian Gan, Bappaditya Mandal: Exploring users' attitudes towards social interaction assistance on Google Glass. 9-12
- Katrin Wolf, Jonas D. Willaredt: PickRing: seamless interaction through pick-up detection. 13-20
- Masa Ogata, Michita Imai: SkinWatch: skin gesture interaction for smart watch. 21-24
- Yuki Ban, Sho Sakurai, Takuji Narumi, Tomohiro Tanikawa, Michitaka Hirose: Improving work productivity by controlling the time rate displayed by the virtual clock. 25-32
- Masaharu Hirose, Karin Iwazaki, Kozue Nojiri, Minato Takeda, Yuta Sugiura, Masahiko Inami: Gravitamine spice: a system that changes the perception of eating through virtual weight sensation. 33-40
- Christoffer Skovgaard, Josephine Raun Thomsen, Nervo Verdezoto, Daniel Vestergaard: DogPulse: augmenting the coordination of dog walking through an ambient awareness system at home. 41-44
- Tomohiro Yokota, Motohiro Ohtake, Yukihiro Nishimura, Toshiya Yui, Rico Uchikura, Tomoko Hashida: Snow walking: motion-limiting device that reproduces the experience of walking in deep snow. 45-48
- Enrique Encinas, Konstantia Koulidou, Robb Mitchell: The kraftwork and the knittstruments: augmenting knitting with sound. 49-52
- Erin Treacy Solovey, Johanna Okerlund, Cassie Hoef, Jasmine Davis, Orit Shaer: Augmenting spatial skills with semi-immersive interactive desktop displays: do immersion cues matter? 53-60
- Anusha I. Withana, Shunsuke Koyama, Daniel Saakes, Kouta Minamizawa, Masahiko Inami, Suranga Nanayakkara: RippleTouch: initial exploration of a wave resonant based full body haptic interface. 61-68
- Robert Peter Matthew, Victor Shia, Masayoshi Tomizuka, Ruzena Bajcsy: Optimal design for individualised passive assistance. 69-76
- Mahasak Surakijboworn, Witaya Wannasuphoprasit: Design of a novel finger exoskeleton with a sliding six-bar joint mechanism. 77-80
- Kota Shimozuru, Tsutomu Terada, Masahiko Tsukamoto: A life log system that recognizes the objects in a pocket. 81-88
- Masasuke Yasumoto, Takehiro Teraoka: VISTouch: dynamic three-dimensional connection between multiple mobile devices. 89-92
- Hideki Koike, Hiroaki Yamaguchi: LumoSpheres: real-time tracking of flying objects and image projection for a volumetric display. 93-96
- Jonathan Mercier-Ganady, Maud Marchal, Anatole Lécuyer: B-C-invisibility power: introducing optical camouflage based on mental activity in augmented reality. 97-100
- Kelly Yap, Clement Zheng, Angela Tay, Ching-Chiuan Yen, Ellen Yi-Luen Do: Word out!: learning the alphabet through full body interactions. 101-108
- Ming Chang, Hiroyuki Iizuka, Yasushi Naruse, Hideyuki Ando, Taro Maeda: Unconscious learning of speech sounds using mismatch negativity neurofeedback. 109-112
- Yoko Nakanishi, Yasuto Nakanishi: Use of an intermediate face between a learner and a teacher in second language learning with shadowing. 113-116
- Tilman Dingler, Alireza Sahami Shirazi, Kai Kunze, Albrecht Schmidt: Assessment of stimuli for supporting speed reading on electronic devices. 117-124
- Kai Kunze, Masai Katsutoshi, Yuji Uema, Masahiko Inami: How much do you read?: counting the number of words a user reads using electrooculography. 125-128
- Ayaka Sato, Jun Rekimoto: Designable sports field: sport design by a human in accordance with the physical status of the player. 129-136
- Takuya Nojima, Ngoc Phuong, Takahiro Kai, Toshiki Sato, Hideki Koike: Augmented dodgeball: an approach to designing augmented sports. 137-140
- Samantha Bielli, Christopher G. Harris: A mobile augmented reality system to enhance live sporting events. 141-144
- Takeo Hamada, Hironori Mitake, Shoichi Hasegawa, Makoto Sato: A teleoperated bottom wiper. 145-150
- Laurens Boer, Nico Hansen, Ragna L. Möller, Ana I. C. Neto, Anne H. Nielsen, Robb Mitchell: The toilet companion: a toilet brush that should be there for you and not for others. 151-154
- Nick Nielsen, Sandra B. P. S. Pedersen, Jens A. Sørensen, Nervo Verdezoto, Nikolai H. Øllegaard: EcoBears: augmenting everyday appliances with symbolic and peripheral feedback. 155-156
- Takuya Iwamoto, Soh Masuko: Lovable couch: mitigating distrustful feelings for couples by visualizing excitation. 157-158
- Shohei Nagai, Shunichi Kasahara, Jun Rekimoto: Directional communication using spatial sound in human-telepresence. 159-160
- Masaharu Hirose, Yuta Sugiura, Kouta Minamizawa, Masahiko Inami: PukuPuCam: a recording system from third-person view in scuba diving. 161-162
- Kai Kunze, Susana Sanchez, Tilman Dingler, Olivier Augereau, Koichi Kise, Masahiko Inami, Tsutomu Terada: The augmented narrative: toward estimating reader engagement. 163-164
- Enrique Encinas, Robb Mitchell: Cyrafour: an experiential activity facilitating empathic distant communication among copresent individuals. 165-166
- Kin Fuai Yong, Juan Pablo Forero, Shaohui Foong, Suranga Nanayakkara: FootNote: designing a cost effective plantar pressure monitoring system for diabetic foot ulcer prevention. 167-168
- Aman Srivastava, Sanskriti Dawle: Mudra: a multimodal interface for braille teaching. 169-170
- Hirohiko Hayakawa, Charith Lasantha Fernando, MHD Yamen Saraiji, Kouta Minamizawa, Susumu Tachi: Telexistence drone: design of a flight telexistence system for immersive aerial sports experience. 171-172
- Robb Mitchell, Alexandra Papadimitriou, Youran You, Laurens Boer: Really eating together: a kinetic table to synchronise social dining experiences. 173-174
- Takanori Komatsu, Chihaya Kuwahara: Extracting users' intended nuances from their expressed movements: in quadruple movements. 175-176
- Hung-Yu Tseng, Rong-Hao Liang, Li-Wei Chan, Bing-Yu Chen: Using point-light movement as peripheral visual guidance for scooter navigation. 177-178
- Santiago Ortega-Avila, Bogdana Rakova, Sajid Sadi, Pranav Mistry: Non-invasive optical detection of hand gestures. 179-180
- Paul Strohmeier: DIY IR sensors for augmenting objects and human skin. 181-182
- Takaaki Hayashizaki, Atsuhiro Fujita, Junki Nozawa, Shohei Ueda, Koichi Hirota, Yasushi Ikei, Michiteru Kitazaki: Walking experience by real-scene optic flow with synchronized vibrations on feet. 183-184
- Soh Masuko, Ryosuke Kuroki: AR-HITOKE: visualizing popularity of brick and mortar shops to support purchase decisions. 185-186
- Takuya Nojima, Miki Yamamura, Junichi Kanebako, Lisako Ishigami, Mage Xue, Hiroko Uchiyama, Naoko Yamazaki: Bio-Collar: a wearable optic-kinetic display for awareness of bio-status. 187-188
- Naoto Uekusa, Takafumi Koike: Superimposed projection of ghosted view on real object with color correction. 189-190
- Yukika Aruga, Takafumi Koike: Taste change of soup by the recreating of sourness and saltiness using the electrical stimulation. 191-192
- Daniel Wessolek, Jochen Huber, Pattie Maes: Body as display: augmenting the face through transillumination. 193-194
- Shinya Kudo, Ryuta Okazaki, Taku Hachisu, Michi Sato, Hiroyuki Kajimoto: Personally supported dynamic random dot stereogram by measuring binocular parallax. 195-196
- Jonathan Mercier-Ganady, Maud Marchal, Anatole Lécuyer: The mind-window: brain activity visualization using tablet-based AR and EEG for multiple users. 197-198
- Shogo Yamashita, Adiyan Mujibiya: POVeye: enhancing e-commerce product visualization by providing realistic image based point-of-view. 199-200
- Daichi Nagata, Yutaka Arakawa, Takatomi Kubo, Keiichi Yasumoto: Effective napping support system by hypnagogic time estimation based on heart rate sensor. 201-202
- Matthew Swarts, Nicholas M. Davis, Chih-Pin Hsiao, James Hallam: Sharing the lights: exploration on teaching electronics for sensory augmentation development. 203-204
- Nan-Ching Tai: Nested perspective: an art installation that intersects the physical and virtual social worlds. 205-206
- Kazuya Murao: Wearable text input interface using touch typing skills. 207-208
- Galit Buchs, Shachar Maidenbaum, Amir Amedi: Augmented non-visual distance sensing with the EyeCane. 209-210
- Jun Nishida, Kanako Takahashi, Kenji Suzuki: A wearable stimulation device for sharing and augmenting kinesthetic feedback. 211-212
- Roshan Lalintha Peiris, Vikum Wijesinghe, Suranga Nanayakkara: SHRUG: stroke haptic rehabilitation using gaming. 213-214
- Jochen Huber, Hasantha Malavipathirana, Yikun Wang, Xinyu Li, Jody C. Fu, Pattie Maes, Suranga Nanayakkara: Feel & see the globe: a thermal, interactive installation. 215-216
- Benjamin Petry, Jochen Huber: Towards effective interaction with omnidirectional videos using immersive virtual reality headsets. 217-218
- Pawel W. Wozniak, Kristina Knaving, Mohammad Obaid, Marta Gonzalez Carcedo, Ayça Ünlüer, Morten Fjeld: ChromaGlove: a wearable haptic feedback device for colour recognition. 219-220
- MHD Yamen Saraiji, Charith Lasantha Fernando, Kouta Minamizawa, Susumu Tachi: Mutual hand representation for telexistence robots using projected virtual hands. 221-222
- Haruki Nakamura, Nobuhisa Hanamitsu, Kouta Minamizawa: A(touch)ment: a smartphone extension for instantly sharing visual and tactile experience. 223-224
- Kohei Ikeda, Koji Tsukada: CapacitiveMarker: novel interaction method using visual marker integrated with conductive pattern. 225-226
- Yusuke Mizushina, Wataru Fujiwara, Tomoaki Sudou, Charith Lasantha Fernando, Kouta Minamizawa, Susumu Tachi: Interactive instant replay: sharing sports experience using 360-degrees spherical images and haptic sensation based on the coupled body motion. 227-228