Paper:
Development of Deskwork Support System Using Pointing Gesture Interface
Masao Sugi*, Hisato Nakanishi**, Masataka Nishino***,
Yusuke Tamura**, Tamio Arai**, and Jun Ota**
*Tokyo University of Agriculture and Technology
**The University of Tokyo
***Honda Motor Co. Ltd.
- [1] T. Sato, Y. Nishida, and H. Mizoguchi, “Robotic room: symbiosis with human through behavior media,” Robotics and Autonomous Systems, Vol.18, pp. 185-194, 1996.
- [2] R. Brooks, “The intelligent room project,” Proc. of the 2nd Int. Cognitive Technology Conf., pp. 271-278, 1997.
- [3] P. Wellner, “Interacting with paper on the DigitalDesk,” Communications of the ACM, Vol.36, No.7, pp. 87-96, 1993.
- [4] B. Ullmer and H. Ishii, “The metaDESK: models and prototypes for tangible user interfaces,” Proc. of the 10th annual ACM symposium on User Interface Software and Technology, pp. 223-232, 1997.
- [5] K. Oka, Y. Sato, and H. Koike, “Real-time fingertip tracking and gesture recognition,” IEEE Computer Graphics and Applications, Vol.22, No.6, pp. 64-71, 2002.
- [6] M. Sugi, M. Nikaido, Y. Tamura, J. Ota, T. Arai, K. Kotani, K. Takamasu, S. Shin, H. Suzuki, and Y. Sato, “Motion Control of Self-Moving Trays for Human Supporting Production Cell ‘Attentive Workbench’,” Proc. of the 2005 IEEE Int. Conf. on Robotics and Automation, pp. 4091-4096, 2005.
- [7] K. Kotani, K. Takamasu, Y. Ashkenazy, H. E. Stanley, and Y. Yamamoto, “Model for cardiorespiratory synchronization in humans,” Physical Review E, Vol.65, 051923, pp. 1-9, 2002.
- [8] M. Sugi, I. Matsumura, Y. Tamura, M. Nikaido, J. Ota, T. Arai, K. Kotani, K. Takamasu, H. Suzuki, A. Yamamoto, Y. Sato, S. Shin, and F. Kimura, “Quantitative Evaluation of Automatic Parts Delivery in ‘Attentive Workbench’ Supporting Workers in Cell Production,” J. of Robotics and Mechatronics, Vol.21, No.1, pp. 135-145, 2009.
- [9] Y. Tamura, M. Sugi, J. Ota, and T. Arai, “Deskwork Support System Based on the Estimation of Human Intention,” Proc. of the 13th IEEE Int. Workshop on Robot and Human Interactive Communication (RO-MAN 2004), pp. 413-418, 2004.
- [10] M. Kavrakli, M. Taylor, and A. Trapeznikov, “Designing in Virtual Reality (DesIRe): A Gesture-Based Interface,” Proc. of the 2nd Int. Conf. on Digital Interactive Media in Entertainment and Arts, pp. 131-136, 2007.
- [11] A. F. Abate, M. De Marsico, S. Levialdi, V. Mastronardi, S. Ricciardi, and G. Tortora, “Gesture Based Interface for Crime Scene Analysis: A Proposal,” Proc. of the Int. Conf. on Computational Science and Its Applications, Part II, pp. 143-154, 2008.
- [12] B. A. Sawyer, “Magnetic Positioning Device,” US Patent No. 3,457,482, 1969.
- [13] X. Chen, K. Takamasu, and M. Nikaidou, “Evaluation of thrust force and positioning accuracy of a new linear motor,” Proc. of the 6th Int. Symposium on Measurement Technology and Intelligent Instruments, p. 126, 2003.
- [14] M. Erdmann and T. Lozano-Pérez, “On Multiple Moving Objects,” Algorithmica, Vol.2, pp. 477-521, 1987.
- [15] K. Tsukada and M. Yasumura, “Ubi-Finger: Gesture Input Device for Mobile Use,” Proc. of the 5th Asia Pacific Conf. on Computer Human Interaction (APCHI 2002), Vol.1, pp. 388-400, 2002.
- [16] K. Irie, N. Wakamura, and K. Umeda, “Construction of an Intelligent Room Based on Gesture Recognition –Operation of Electric Appliances with Hand Gestures–,” Proc. of the 2004 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 193-198, 2004.
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.
Copyright © 2010 by Fuji Technology Press Ltd. and Japan Society of Mechanical Engineers. All rights reserved.