Abstract
Autonomous underwater vehicles (AUVs) play an important role in ocean research and exploration. The underwater environment strongly affects both AUV control and human–robot interaction, since it is highly dynamic, with unpredictable water-flow fluctuations, high pressure, and light attenuation. Traditional control models contain a large number of parameters, which makes them ineffective and error-prone. Fuzzy control addresses this issue to some extent: it applies fuzzy variables to the controller, each of which replaces the values over an interval. Beyond the controller, underwater human–robot interaction is itself difficult: divers can neither speak nor show facial expressions underwater, and physical buttons on an AUV must withstand enormous water pressure. In this paper, we propose a method to recognize gesture instructions and apply it to the fuzzy control of an AUV. Our contribution is a gesture recognition framework for human–robot interaction, comprising a gesture detection network and an algorithm for AUV control. Experimental results demonstrate the effectiveness of the proposed method.
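The idea of replacing exact interval values with fuzzy linguistic variables can be illustrated with a minimal sketch. The controller below is a generic Mamdani-style example, not the paper's method: it maps a heading error to a yaw command through three assumed triangular membership sets ("left", "zero", "right"); all function names, set boundaries, and rule outputs are illustrative.

```python
def tri(x, a, b, c):
    """Triangular membership: rises from a to a peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_yaw_command(error_deg):
    """Map a heading error (degrees) to a yaw command in [-1, 1]."""
    # Fuzzify: degree of membership in three linguistic sets
    # (boundaries are illustrative assumptions).
    left  = tri(error_deg, -60, -30,  0)
    zero  = tri(error_deg, -30,   0, 30)
    right = tri(error_deg,   0,  30, 60)
    # Each rule steers against its error set; defuzzify with a
    # membership-weighted average of the rule outputs.
    weights = [left, zero, right]
    outputs = [1.0, 0.0, -1.0]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    return sum(w * o for w, o in zip(weights, outputs)) / total
```

Because each linguistic set covers an interval of errors, the controller responds smoothly without tuning a separate gain for every operating point, e.g. an error of 15 degrees is half "zero" and half "right" and yields a half-strength corrective command.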
Acknowledgements
This work was supported by the National Natural Science Foundation of China under Grant 51679105, Grant 51809112, and Grant 51939003.
Cite this article
Jiang, Y., Peng, X., Xue, M. et al. An Underwater Human–Robot Interaction Using Hand Gestures for Fuzzy Control. Int. J. Fuzzy Syst. 23, 1879–1889 (2021). https://doi.org/10.1007/s40815-020-00946-2