Human-Centered Navigation and Person-Following with Omnidirectional Robot for Indoor Assistance and Monitoring
<p>Visualization of the human-centered navigation service task for domestic robot assistance: the rover has to reach different goals while continuously monitoring the user.</p> "> Figure 2
<p>Visualization of the human-centered person-following task for domestic assistance: the omnidirectional capability allows the rover to follow the user while maintaining its orientation towards them and avoiding obstacles. (<b>a</b>) The robot can follow the same path as the person while avoiding obstacles. (<b>b</b>) The robot must follow a different path while keeping the person monitored.</p> "> Figure 3
<p>The omnidirectional platform we set up for experimentation and validation of our novel methodology. The vertical shaft allows the camera to be elevated over most indoor environment obstacles.</p> "> Figure 4
<p>Real-time visual perception pipeline for person identification, tracking, and coordinate extraction. <math display="inline"><semantics> <mrow> <mo>[</mo> <msub> <mi>x</mi> <mi>C</mi> </msub> <mo>,</mo> <msub> <mi>y</mi> <mi>C</mi> </msub> <mo>]</mo> </mrow> </semantics></math> are the coordinates of the person’s center in the image frame, while <math display="inline"><semantics> <mrow> <mo>[</mo> <msub> <mi>x</mi> <mi>p</mi> </msub> <mo>,</mo> <msub> <mi>y</mi> <mi>p</mi> </msub> <mo>]</mo> </mrow> </semantics></math> indicates them in the robot reference frame. The person’s pose is continuously estimated by PoseNet at 30 fps and tracked with SORT; a set of reliable pose key-points is then used to extract the center coordinates.</p> "> Figure 5
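The center-extraction step in the caption above can be sketched as a confidence-weighted reduction over the pose key-points. A minimal sketch follows; the `min_score` threshold and the mean-of-key-points rule are our illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def person_center(keypoints, scores, min_score=0.5):
    """Estimate the person's image-frame center [x_C, y_C] from pose key-points.

    `keypoints` is an (N, 2) array of [x, y] pixel coordinates and `scores`
    an (N,) array of per-key-point confidences, as produced by pose
    estimators such as PoseNet. Only key-points above `min_score` are
    treated as reliable; the center is their mean.
    """
    keypoints = np.asarray(keypoints, dtype=float)
    scores = np.asarray(scores, dtype=float)
    reliable = scores >= min_score
    if not reliable.any():
        return None  # detection lost; hand over to the tracker/recovery logic
    return keypoints[reliable].mean(axis=0)  # [x_C, y_C]
```

The `[x_C, y_C]` output would then be mapped to `[x_p, y_p]` in the robot frame using depth data and the camera extrinsics.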
<p>Human-centered navigation methodology pipeline scheme. The linear and angular velocities <math display="inline"><semantics> <mrow> <mo>[</mo> <msub> <mi>v</mi> <mi>x</mi> </msub> <mo>,</mo> <msub> <mi>v</mi> <mi>y</mi> </msub> <mo>,</mo> <mo>ω</mo> <mo>]</mo> </mrow> </semantics></math> are generated separately to carry out obstacle avoidance through local trajectory planning together with person monitoring through yaw control.</p> "> Figure 6
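The decoupling shown in the scheme — linear velocities from the local trajectory planner, angular velocity from a person-facing yaw controller — can be sketched as follows. The function name and the gains `k` and `omega_max` are illustrative assumptions, not the paper's values:

```python
import math

def decoupled_command(v_planner, robot_pose, person_xy, k=1.5, omega_max=1.0):
    """Combine planner linear velocities with an independent yaw controller.

    `v_planner` = (vx, vy) from the local trajectory planner (obstacle
    avoidance); `robot_pose` = (x, y, yaw); `person_xy` is the person's
    position in the same frame. Thanks to the omnidirectional base, the
    angular velocity can be computed separately, so the robot keeps facing
    the person while translating along the planned path.
    """
    x, y, yaw = robot_pose
    desired_yaw = math.atan2(person_xy[1] - y, person_xy[0] - x)
    # Wrap the heading error to [-pi, pi] before applying the gain.
    err = math.atan2(math.sin(desired_yaw - yaw), math.cos(desired_yaw - yaw))
    omega = max(-omega_max, min(omega_max, k * err))
    return v_planner[0], v_planner[1], omega
```

A differential-drive base could not apply this split, since its heading is tied to its direction of travel; the omnidirectional platform is what makes the two control channels independent.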
<p>Omnidirectional person-centered navigation results in two scenarios with the person in different positions: in (<b>a</b>) the person is close to the navigation goal, in (<b>b</b>) the person is at the corner of the hallway. Red arrows indicate the position and orientation of the rover at different time instants, the blue point marks the person’s position, and the orange spline represents the path traversed by the rover.</p> "> Figure 7
<p>Qualitative visualization of the four scenarios set up for the person-following test. In the upper row, a schematic representation is shown, where red objects represent low-height obstacles that the robot’s camera can see over. In the lower row, the real testing area with the robot is shown.</p> "> Figure 8
<p>Person-following results in the first two scenarios: scenario 1 is composed of a wide U-shaped path, while scenario 2 presents narrow passages through obstacles. Red arrows indicate the position and orientation of the rover associated with the person’s position (blue point) at the same instant. The orange spline represents the path traversed by the rover. (<b>a</b>) Scenario 1—Omnidirectional configuration; (<b>b</b>) Scenario 1—Differential configuration; (<b>c</b>) Scenario 2—Omnidirectional configuration; (<b>d</b>) Scenario 2—Differential configuration.</p> "> Figure 9
<p>Person-following results in the third and fourth scenarios: scenario 3 presents a high number of obstacles and possible paths, while scenario 4 is composed of a high <math display="inline"><semantics> <msup> <mn>90</mn> <mo>∘</mo> </msup> </semantics></math> wall to be circumnavigated. Red arrows indicate the position and orientation of the rover associated with the person’s position (blue point) at the same instant. The orange spline represents the path traversed by the rover. (<b>a</b>) Scenario 3—Omnidirectional configuration; (<b>b</b>) Scenario 3—Differential configuration; (<b>c</b>) Scenario 4—Omnidirectional configuration; (<b>d</b>) Scenario 4—Differential configuration.</p> ">
Abstract
1. Introduction
- We identify an omnidirectional motion planning approach as a robust, effective solution to boost the mobility of a robotic assistant during its principal navigation activities (person-following and goal-based navigation);
- We set up a real-time, cost-effective perception pipeline to extract the person’s coordinates and visually track their pose;
- We effectively integrate a navigation algorithmic stack that separately handles trajectory generation for obstacle avoidance and orientation control for person monitoring.
2. Related Works
2.1. Person Identification and Tracking
2.2. Navigation and Obstacle Avoidance
3. Human-Centered Autonomous Navigation
3.1. Perception and Tracking
- Especially in crowded environments, where multiple people are present in every frame, the subject could be mistaken for another person in the image (or vice versa);
- Without a component capable of exploiting observations from previous time instants, it could be very difficult to guarantee real-time performance if detection of the subject is lost for a few consecutive frames. This problem is particularly critical in situations where the view of the subject is occluded by obstacles or other people in the scene.
3.2. Omnidirectional Motion Planner and Obstacle Avoidance
3.3. Person-Focused Orientation Control
- k is a parameter used to linearly increase as grows;
- is another parameter used to limit the maximum value assumed by .
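The two bullets above describe a saturated proportional yaw law; since the math symbols were lost in extraction, a plausible reconstruction in our own notation (not necessarily the paper's symbols) is:

```latex
\omega \;=\; \operatorname{sign}\!\left(e_{\mathrm{yaw}}\right)\,
\min\!\left(k\,\lvert e_{\mathrm{yaw}}\rvert,\; \omega_{\max}\right)
```

where \(e_{\mathrm{yaw}}\) is the angular error between the robot's heading and the direction of the person, \(k\) makes \(\omega\) grow linearly with the error, and \(\omega_{\max}\) bounds its maximum value.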
4. Experiments and Results
- The first experimental stage aims to demonstrate the effectiveness of the person-centered navigation task for monitoring purposes, where the rover has to navigate from a point A to a target point B of coordinates , maintaining its focus on the subject located in ;
- The second series of experiments takes into consideration the person-following task, where and coincide and represent the dynamic goal obtained from the visual perception pipeline, which identifies and tracks the person of interest.
- Safety distance module: During rover operation, the user’s safety must always be ensured, even if this leads to the failure of the requested task. For this reason, a module able to truncate the rover’s navigation path is inserted, guaranteeing that a minimum distance of one meter from any person is always maintained.
- Recovery policy for person tracking: During navigation towards a specified goal, the rover may lose track of the person. If the track is not resumed within a certain time interval, a dedicated module commands the rover to interrupt the navigation and start rotating towards the direction in which the person was last perceived, in an attempt to regain visual contact with the user.
- Recovery policy for person-following: The same problem described above can occur during the person-following task but, in this case, the consequences could be even worse, since knowledge of the person’s position affects not only the yaw but also the linear components of the navigation. To re-establish the track, the rover first heads towards the last known position of the user, maintaining its orientation towards that location. This choice compensates for all those cases in which the person turns behind an obstacle, such as a wall, and simply moving towards the corner where the user was last seen is enough to regain visual contact. If this proves insufficient, once the robot has reached the last known position, it starts rotating as described above.
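The recovery behavior above amounts to a small mode selector. A sketch follows; the class name, the time-out, and the arrival radius are illustrative assumptions, as the text does not specify exact values:

```python
import math
from enum import Enum, auto

class Mode(Enum):
    FOLLOW = auto()          # track alive: normal person-following
    GOTO_LAST_SEEN = auto()  # head to last known position, facing it
    SEARCH_ROTATE = auto()   # rotate in place to regain visual contact

class RecoveryPolicy:
    """Mode selection for the person-following recovery behavior."""

    def __init__(self, lost_timeout=2.0, arrive_radius=0.3):
        self.lost_timeout = lost_timeout    # seconds without a track
        self.arrive_radius = arrive_radius  # meters, "reached" threshold
        self.last_seen_xy = None
        self.last_seen_t = None

    def update(self, person_xy, robot_xy, now):
        if person_xy is not None:  # detection/track available
            self.last_seen_xy = person_xy
            self.last_seen_t = now
            return Mode.FOLLOW
        if self.last_seen_t is None or now - self.last_seen_t <= self.lost_timeout:
            return Mode.FOLLOW     # short drop-outs are bridged by the tracker
        # Track lost: first move to where the person was last seen (covers
        # the person turning behind a corner), then rotate in place to search.
        if math.dist(robot_xy, self.last_seen_xy) > self.arrive_radius:
            return Mode.GOTO_LAST_SEEN
        return Mode.SEARCH_ROTATE
```

In `GOTO_LAST_SEEN` the yaw controller keeps the robot oriented towards the last known position while the planner drives there, matching the behavior described in the bullet.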
4.1. Person-Centered Navigation
4.2. Person Following
5. Conclusions and Future Works
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Martinez-Martin, E.; del Pobil, A.P. Personal robot assistants for elderly care: An overview. In Personal Assistants: Emerging Computational Technologies; Springer: Cham, Switzerland, 2018; pp. 77–91.
- Vercelli, A.; Rainero, I.; Ciferri, L.; Boido, M.; Pirri, F. Robots in elderly care. Digit.-Sci. J. Digit. Cult. 2018, 2, 37–50.
- United Nations. Shifting Demographics; United Nations: New York, NY, USA, 2019.
- Novak, L.L.; Sebastian, J.G.; Lustig, T.A. The World Has Changed: Emerging Challenges for Health Care Research to Reduce Social Isolation and Loneliness Related to COVID-19. NAM Perspect. 2020, 2020.
- Shen, Y.; Guo, D.; Long, F.; Mateos, L.A.; Ding, H.; Xiu, Z.; Hellman, R.B.; King, A.; Chen, S.; Zhang, C.; et al. Robots under COVID-19 pandemic: A comprehensive survey. IEEE Access 2020, 9, 1590–1615.
- Abdi, J.; Al-Hindawi, A.; Ng, T.; Vizcaychipi, M.P. Scoping review on the use of socially assistive robot technology in elderly care. BMJ Open 2018, 8, e018815.
- Góngora Alonso, S.; Hamrioui, S.; de la Torre Díez, I.; Motta Cruz, E.; López-Coronado, M.; Franco, M. Social robots for people with aging and dementia: A systematic review of literature. Telemed. E-Health 2019, 25, 533–540.
- Gasteiger, N.; Loveys, K.; Law, M.; Broadbent, E. Friends from the Future: A Scoping Review of Research into Robots and Computer Agents to Combat Loneliness in Older People. Clin. Interv. Aging 2021, 16, 941.
- Yatsuda, A.; Haramaki, T.; Nishino, H. A Study on Robot Motions Inducing Awareness for Elderly Care. In Proceedings of the 2018 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-TW), Taichung, Taiwan, 19–21 May 2018; pp. 1–2.
- Möller, R.; Furnari, A.; Battiato, S.; Härmä, A.; Farinella, G.M. A survey on human-aware robot navigation. Robot. Auton. Syst. 2021, 145, 103837.
- Islam, M.J.; Hong, J.; Sattar, J. Person-following by autonomous robots: A categorical overview. Int. J. Robot. Res. 2019, 38, 1581–1618.
- Honig, S.S.; Oron-Gilad, T.; Zaichyk, H.; Sarne-Fleischmann, V.; Olatunji, S.; Edan, Y. Toward socially aware person-following robots. IEEE Trans. Cogn. Dev. Syst. 2018, 10, 936–954.
- Eirale, A.; Martini, M.; Tagliavini, L.; Gandini, D.; Chiaberge, M.; Quaglia, G. Marvin: An Innovative Omni-Directional Robotic Assistant for Domestic Environments. Sensors 2022, 22, 5261.
- Jia, D.; Hermans, A.; Leibe, B. DR-SPAAM: A Spatial-Attention and Auto-regressive Model for Person Detection in 2D Range Data. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021; pp. 10270–10277.
- Cha, D.; Chung, W. Human-Leg Detection in 3D Feature Space for a Person-Following Mobile Robot Using 2D LiDARs. Int. J. Precis. Eng. Manuf. 2020, 21, 1299–1307.
- Guerrero-Higueras, Á.M.; Álvarez-Aparicio, C.; Calvo Olivera, M.C.; Rodríguez-Lera, F.J.; Fernández-Llamas, C.; Rico, F.M.; Matellán, V. Tracking People in a Mobile Robot from 2D LIDAR Scans Using Full Convolutional Neural Networks for Security in Cluttered Environments. Front. Neurorobotics 2019, 12, 85.
- Wang, W.; Liu, P.; Ying, R.; Wang, J.; Qian, J.; Jia, J.; Gao, J. A High-Computational Efficiency Human Detection and Flow Estimation Method Based on TOF Measurements. Sensors 2019, 19, 729.
- Zoghlami, F.; Sen, O.K.; Heinrich, H.; Schneider, G.; Ercelik, E.; Knoll, A.; Villmann, T. ToF/Radar early feature-based fusion system for human detection and tracking. In Proceedings of the 2021 22nd IEEE International Conference on Industrial Technology (ICIT), Valencia, Spain, 10–12 March 2021; Volume 1, pp. 942–949.
- Zhao, Z.Q. Object detection with deep learning: A review. IEEE Trans. Neural Netw. Learn. Syst. 2019, 30, 3212–3232.
- Zhang, X.; Chen, Z.; Wu, Q.J.; Cai, L.; Lu, D.; Li, X. Fast semantic segmentation for scene perception. IEEE Trans. Ind. Inf. 2018, 15, 1183–1192.
- Cao, Z.; Simon, T.; Wei, S.E.; Sheikh, Y. OpenPose: Realtime multi-person 2D pose estimation using Part Affinity Fields. IEEE Trans. Pattern Anal. Mach. Intell. 2019, 43, 172–186.
- Gupta, M.; Kumar, S.; Behera, L.; Subramanian, V.K. A novel vision-based tracking algorithm for a human-following mobile robot. IEEE Trans. Syst. Man Cybern. Syst. 2016, 47, 1415–1427.
- Koide, K.; Miura, J.; Menegatti, E. Monocular person tracking and identification with on-line deep feature selection for person following robots. Robot. Auton. Syst. 2020, 124, 103348.
- Koide, K.; Miura, J. Identification of a specific person using color, height, and gait features for a person following robot. Robot. Auton. Syst. 2016, 84, 76–87.
- Eisenbach, M.; Vorndran, A.; Sorge, S.; Gross, H.M. User recognition for guiding and following people with a mobile robot in a clinical environment. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 3600–3607.
- Wang, M.; Liu, Y.; Su, D.; Liao, Y.; Shi, L.; Xu, J.; Miro, J.V. Accurate and real-time 3-D tracking for the following robots by fusing vision and ultrasonar information. IEEE/ASME Trans. Mechatron. 2018, 23, 997–1006.
- Chi, W.; Wang, J.; Meng, M.Q.H. A gait recognition method for human following in service robots. IEEE Trans. Syst. Man Cybern. Syst. 2017, 48, 1429–1440.
- Kobilarov, M.; Sukhatme, G.; Hyams, J.; Batavia, P. People tracking and following with mobile robot using an omnidirectional camera and a laser. In Proceedings of the 2006 IEEE International Conference on Robotics and Automation, ICRA 2006, Orlando, FL, USA, 15–19 May 2006; pp. 557–562.
- Huh, S.; Shim, D.H.; Kim, J. Integrated navigation system using camera and gimbaled laser scanner for indoor and outdoor autonomous flight of UAVs. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 3158–3163.
- Boschi, A.; Salvetti, F.; Mazzia, V.; Chiaberge, M. A cost-effective person-following system for assistive unmanned vehicles with deep learning at the edge. Machines 2020, 8, 49.
- Pang, L.; Zhang, Y.; Coleman, S.; Cao, H. Efficient hybrid-supervised deep reinforcement learning for person following robot. J. Intell. Robot. Syst. 2020, 97, 299–312.
- Chen, B.X.; Sahdev, R.; Tsotsos, J.K. Integrating stereo vision with a CNN tracker for a person-following robot. In Proceedings of the International Conference on Computer Vision Systems, Shenzhen, China, 10–13 July 2017; pp. 300–313.
- Cen, M.; Huang, Y.; Zhong, X.; Peng, X.; Zou, C. Real-time Obstacle Avoidance and Person Following Based on Adaptive Window Approach. In Proceedings of the 2019 IEEE International Conference on Mechatronics and Automation (ICMA), Tianjin, China, 4–7 August 2019; pp. 64–69.
- Zhang, K.; Zhang, L. Autonomous following indoor omnidirectional mobile robot. In Proceedings of the 2018 Chinese Control and Decision Conference (CCDC), Shenyang, China, 9–11 June 2018; pp. 461–466.
- Chen, C.W.; Tseng, S.P.; Hsu, Y.T.; Wang, J.F. Design and implementation of human following for separable omnidirectional mobile system of smart home robot. In Proceedings of the 2017 International Conference on Orange Technologies (ICOT), Singapore, 8–10 December 2017; pp. 210–213.
- Papandreou, G. PersonLab: Person Pose Estimation and Instance Segmentation with a Bottom-Up, Part-Based, Geometric Embedding Model. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018.
- Bewley, A.; Ge, Z.; Ott, L.; Ramos, F.; Upcroft, B. Simple Online and Realtime Tracking. In Proceedings of the 2016 IEEE International Conference on Image Processing (ICIP), Phoenix, AZ, USA, 25–28 September 2016.
- Saha, O.; Dasgupta, P. A Comprehensive Survey of Recent Trends in Cloud Robotics Architectures and Applications. Robotics 2018, 7, 47.
- Maruyama, Y.; Kato, S.; Azumi, T. Exploring the Performance of ROS2; EMSOFT ’16; Association for Computing Machinery: New York, NY, USA, 2016.
Error | Mean | Std. Dev. | RMSE | MAE |
---|---|---|---|---|
First Scenario | ||||
Omnidir. | ||||
Differential | ||||
Improvement | ||||
Second Scenario | ||||
Omnidir. | ||||
Differential | ||||
Improvement |
Error | Mean | Std. Dev. | RMSE | MAE |
---|---|---|---|---|
Scenario A | ||||
Omnidir. | ||||
Differential | ||||
Improvement | ||||
Scenario B | ||||
Omnidir. | ||||
Differential | ||||
Improvement | ||||
Scenario C | ||||
Omnidir. | ||||
Differential | ||||
Improvement | ||||
Scenario D | ||||
Omnidir. | ||||
Differential | ||||
Improvement |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Eirale, A.; Martini, M.; Chiaberge, M. Human-Centered Navigation and Person-Following with Omnidirectional Robot for Indoor Assistance and Monitoring. Robotics 2022, 11, 108. https://doi.org/10.3390/robotics11050108