
IJAT Vol.17 No.3, pp. 226-236 (2023)
doi: 10.20965/ijat.2023.p0226

Research Paper:

Motion Analysis of the Torso and the Elbow During Picking and Placing Heavy Object on Shelf

Kaito Hara, Satoki Tsuichihara, and Yasutake Takahashi

University of Fukui
3-9-1 Bunkyo, Fukui-shi, Fukui 910-8507, Japan


Received: October 31, 2022
Accepted: February 24, 2023
Published: May 5, 2023

Keywords: motion capture, motion analysis on redundant part, zero moment point
Abstract

In recent years, labor shortages and workforce aging have become serious problems. One solution is to introduce humanoid robots to perform human tasks. This research aims to analyze the posture, and the preparatory posture, of the torso and elbow in human motion, in order to improve the motion planning of humanoid robots that exploit redundant body parts. We analyze the forward tilt angle of the torso, the timing of backward torso bending, and the lateral tilt angle of the elbow when humans exert force. The experiments focused on picking and placing heavy objects on a shelf, a task that requires both maintaining balance and exerting force, and confirmed how the movements change with the weight of the object. An optical motion capture system and a six-axis load cell were used to measure the motion. The subjects were asked to approach the shelf from a distance of 1 m and move a heavy object from the upper level to the middle level of the shelf. The results show that the heavier the object, the earlier the preparatory posture of bending the torso backward appeared before grasping, and the larger the forward tilt angle of the torso. Significant differences were observed in the time between backward bending and picking up the object, in the distance from the shoulder to the wrist joint, and in the tilt angle of the torso. Considering the dynamic stability of the zero-moment point (ZMP), we identified the posture, and the time required to bend the torso, for holding a heavier object in front of the shelf.
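The two quantities at the core of this analysis, the torso tilt angle from motion-capture markers and the ZMP from a six-axis load cell, can be computed directly. The following Python sketch is a minimal illustration under stated assumptions, not the processing pipeline used in the paper: it assumes a z-up world frame, hypothetical hip and shoulder marker positions, and a load cell whose measurement origin lies in the ground plane (the standard force-plate ZMP formula).

    import numpy as np

    def torso_tilt_deg(hip, shoulder):
        # Forward tilt of the torso: angle between the hip-to-shoulder
        # vector and the vertical (z) axis, in degrees.
        v = np.asarray(shoulder, dtype=float) - np.asarray(hip, dtype=float)
        cos_t = v[2] / np.linalg.norm(v)
        # Clamp to [-1, 1] to guard against floating-point round-off.
        return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

    def zmp_from_load_cell(force, moment, h=0.0):
        # Standard force-plate formula for the zero-moment point on the
        # ground plane. force = (fx, fy, fz) in N, moment = (tx, ty, tz)
        # in N*m, h = height of the contact surface above the sensor origin.
        fx, fy, fz = force
        tx, ty, tz = moment
        px = (-ty - fx * h) / fz
        py = (tx - fy * h) / fz
        return px, py

    # Hypothetical sample values: marker positions in meters, a mostly
    # vertical load on the cell.
    print(torso_tilt_deg([0.0, 0.0, 1.0], [0.1, 0.0, 1.5]))           # ~11.3 deg
    print(zmp_from_load_cell((10.0, 0.0, 600.0), (0.0, -30.0, 0.0)))  # (0.05, 0.0)

As long as the ZMP stays within the support polygon of the feet, the posture is dynamically stable, which suggests why bending the torso backward before lifting a heavy object in front of the body helps maintain balance.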

Cite this article as:
K. Hara, S. Tsuichihara, and Y. Takahashi, “Motion Analysis of the Torso and the Elbow During Picking and Placing Heavy Object on Shelf,” Int. J. Automation Technol., Vol.17 No.3, pp. 226-236, 2023.
