Review

Telepresence in the Recent Literature with a Focus on Robotic Platforms, Applications and Challenges

College of Engineering and Technology, American University of the Middle East, Egaila 54200, Kuwait
* Author to whom correspondence should be addressed.
Robotics 2023, 12(4), 111; https://doi.org/10.3390/robotics12040111
Submission received: 21 June 2023 / Revised: 24 July 2023 / Accepted: 24 July 2023 / Published: 1 August 2023
(This article belongs to the Section Educational Robotics)

Abstract

Three decades ago, telepresence was presented as an idea in the context of remote work and manipulation. Since then, it has evolved into a field combining different technologies and allowing users to have more or less realistic perceptions of immersion in remote environments. This paper reviews telepresence and its recent advances. While not covering all the work conducted in telepresence, this paper provides an array of applications for which telepresence can be envisioned, providing a clear view of the differences between components and functionalities of robotic platforms conceived for telepresence and pointing to the dependence of telepresence on several technological areas. Furthermore, challenges faced by telepresence technologies are shown, with consideration of user experiences. We consider telepresence from different perspectives, focusing on specific parts, making it possible to foresee future directions of research and applications. This review will be useful for researchers working in telepresence and related fields.

1. Introduction

Presence can be seen from different perspectives. While spatial presence refers to a user’s feeling of being in an environment, social presence refers to the sense of being with one or more other social beings [1]. In a related consideration, telepresence refers to the perception of being present in an environment that is generated by mediated means like an environment that is real but spatially or temporally distant, or non-existent and synthesized by a computer [2,3].
Telepresence relies on different fields of technology and aims to provide persons with the capacity to perceive and/or act in remote environments. The term telepresence was coined by Marvin Minsky when presenting the idea of remote-controlled mechanical hands allowing a user to work in a remote environment with a feeling of what is happening there [4]. More recently, combining sensing, actuation, signal transmission and processing technologies, telepresence systems have been proposed with different shapes, features and usages. Telepresence systems have been evolving from mobile platforms with signal display and acquisition comparable to video calling or video conferencing to platforms with more developed features, allowing users to have immersive perceptions and to manipulate objects in remote environments. Telepresence is networked, requiring the transmission of data between distant locations [5]. This differentiates it from virtual presence, which can be local and use technologies of virtual reality (VR) and augmented reality (AR) [5]. Enabling improvements in interactions between persons, telepresence systems involve a main user who perceives the signals captured by the platform and other possible users in interaction with the main user. In a review by Kristoffersson et al. [6], terminology was proposed involving a “pilot user” remotely connecting to a mobile robotic telepresence system and a “local user” located in the same environment as the robot. Other definitions of users have been proposed [7,8], but similar terms do not necessarily have the same meanings.
Telepresence platforms allow users to interact on different levels, from visual and auditory speech-based interaction to tactile interaction and manipulation in the environment. In this context, robots can be of considerable benefit in telepresence, as they can provide functionalities of mobility, manipulation and customizability. Teleoperation is a field of application in which the cognitive skills of humans can be integrated with the physical abilities of a robot, with many applications [9].
Additionally, the functionalities of telepresence platforms enable different levels of presence to be achieved in remote environments. Such presence can be sensed through videoconferencing, where 2D visual information of the remote environment is streamed to the user, or through immersion, whereby users can be represented more realistically [10]. Immersion can also cover groups of large numbers of users, not being limited to one user [11]. In this context, studies have been conducted on the feeling that persons may have of being present in two locations at once [12,13]. The concept of bilocation can extend from self-localization in two different environments to self-identification with another body and reduplication of first-person perspective [14]. Indeed, the human physical body and self are located in the same position in space in daily situations, but self-location can be illusory and affected by senses and feelings like vision and touch [15]. Such considerations can be taken into account in immersive telepresence works addressing embodiment or ownership of a body different from the user’s body [16,17].
Recently, the usefulness of telepresence concepts and systems has emerged in several areas of human–human interaction due to the spread of COVID-19, which pushed the adoption of contactless interaction solutions [18,19]. Telepresence has benefited from advances in social robotics [20,21] and telecommunication technologies but still faces challenges like controllability, stability and autonomy [22]. Furthermore, social presence has been presented as a possible evolution in telepresence where social cues of the persons in interaction are efficiently relayed. Such cues involve eye contact, facial expression, eye gaze, orientation and touch, for example [23]. Social presence has also been considered in the field of remote collaboration, where on-site and remote team members from different backgrounds collaborate in specific activities [24]. It was argued in [24] that social presence alone cannot indicate whether remote collaboration is good or bad, as some aspects of collaboration can be achieved without all collaborators seeing each other at all times.
In the literature, telepresence can be seen to be envisioned for different applications used in different paradigms and relying on different technologies. Among its usages, health care [19,25], care for the elderly [7,8] and education [26] can be mentioned. The adoption of telepresence robots has expanded their market to hundreds of millions of dollars, with expectations that it will reach billions [27,28]. The interactions of telepresence with different aspects of human daily life and with different fields of technology are varied, presenting a wide array of possibilities. It is important to assemble different works in one study to examine their differences and similarities, providing insights into the evolution of telepresence and supporting the envisioning of possible future directions of research for different aspects of telepresence.
The goal of this paper is to accomplish these tasks by reviewing works published on telepresence in recent years, including the last decade. Published works are assembled, reviewed and presented in different parts of the paper. Selected works are mainly research papers obtained from databases of peer-reviewed conferences and journals tackling technological fields related to telepresence, robotics and communication, selected according to their relevance and date of publication. The main sources of the cited references are journals and conference proceedings included in the IEEE Xplore database. Other related sources are referenced, such as the International Journal of Social Robotics and MDPI journals. The following keywords were used to search for sources: “telepresence”, “teleoperation”, “immersion”, “remote operation”, “telepresence in education”, “telepresence in industry”, etc. This paper cites more than 100 references, more than 80% of which were published after 2012. Among these sources, attention was focused on papers published after 2019. Older references are considered either for their high relevance in telepresence or related fields or for their definition and usage of important concepts illustrated and mentioned in this paper.
Cited works have been published in more than 60 different sources, including conferences, journals and books. Table 1 shows sources with two or more publications cited in this paper.
This paper is organized as follows. Section 2 shows works categorized into different fields of application of telepresence. Section 3 presents different platforms that have been used for telepresence applications. Section 4 shows the different elements that contribute to the functioning of a telepresence experience. Section 5 shows different challenges facing telepresence works and foresees directions of research and applications for telepresence. Finally, Section 6 concludes the paper.

2. Applications of Telepresence Systems

Although a telepresence system, in general, can be applied in different contexts, some works have designed and evaluated telepresence systems in specific contexts and domains of usage. Telepresence systems can be seen in fields like care and assistance, medicine and education. In this section, works from these and other fields are presented. Figure 1 shows the different fields covered by this review and the number of papers cited in each field. The presented numbers can indicate how easy it was to find references and the amount of work on telepresence that has been conducted in each field. More details about these references are shown in the following subsections. Other references in these fields are also cited in other sections of the paper where appropriate.

2.1. Care and Assistance

In settings of care for the elderly, problems like loneliness may arise, and robotic assistance may be helpful [29]. In the assistance of seniors at home, telepresence platforms require the capacity to navigate and interact safely and with a degree of autonomy [30]. However, despite their possible usefulness, costs of initial setup and maintenance of robots in health care need to be taken into account [31]. The usage of telepresence robots with the elderly was tackled in [8], where the difference between evaluations of social robots in laboratory settings and their real contexts of usage was highlighted. A methodology for the evaluation of a telepresence robot’s usage in supporting social interactions for elderly people was proposed. This methodology introduces variables to take into account in the evaluation of robots. Among these variables, social health and technology impact were considered. The method also includes an assessment of features over time, as robot evaluations could span several months.
Trials of double telepresence robots (see Figure 2) were performed in [7]. Robots were shown to increase the presence of family members with their elderly parents in care facilities. Although this was shown to improve the well-being of the residents, their privacy was reported as a possible challenge, as they were seen to need to have control over accepting or rejecting a call.
A telepresence robot was presented in [32] as a low-cost assistive platform with the ability to help the elderly and caregivers. Tests conducted on the platform showed positive attitudes towards it and a willingness to use it. A wheelchair was also presented in [33] as a telepresence system with potential benefits for individuals with mobility challenges. It was designed to allow for safe navigation and aimed to provide two-way video communication and to allow for social interaction. In addition, a system was presented in [34] that was designed to help people with mobility or speech disabilities to communicate and interact with others. It relies on the Loomo mobile platform (Loomo Personal Robot|Self-Balancing Scooter|Segway Official Store, Available: https://store.segway.com/segway-loomo-mini-transporter-robot-sidekick, accessed on 5 March 2023), text-to-speech and communication technologies and allows users to communicate with others, interact physically and show certain emotions.
The context of the COVID-19 pandemic has enabled and fostered the usage of remote technologies for human interactions, especially the assistance of older adults. The usage of mobile telepresence robots in this context was reviewed in [35]. The study presented evidence that these robots have potential in reducing social isolation in elderly people. In a similar context, the authors of [36] tested specific assistive technologies including telepresence and showed that they were accepted by older adults and professionals in care. The obtained results showed the use of telepresence robots as a means to reduce anxiety in homes and residential facilities. Assistive telepresence systems have not only been used with the elderly but also with people with disabilities. For instance, a telepresence wheelchair was presented in [37] that was intended to provide monitoring and remote assistance.

2.2. Medicine and Health Care

In the field of health care, telehealth has been introduced in the context of providing healthcare services with the use of modern technologies [38], allowing clinics, for instance, to offer virtual visits. While telehealth initially relied on technologies like online video and phone communication, robots have since appeared in works related to health care and medicine. Due to factors like the advancement of artificial intelligence (AI) and automation, telepresence robots have been widely conceived for usage in medicine and health care [19,39]. They can be used in applications like communication between physicians and patients, remote operations like surgery and assistance in daily tasks.
As an application of telepresence robotics in health care, Akibot was presented in [40] as a system to allow medical doctors to attend to patients while being in remote areas. The platform was equipped with some medical devices for increased utility and consisted of a wheeled mobile platform with an emphatic design. However, it was made to look “as non-human-like as possible”, as a previous study [41] stated that human-likeness leads to expectations of the robot that it cannot meet, leading to less attractiveness and acceptability. The robot interface consisted of a screen displaying information to users in interaction with the platform, and a webcam with a built-in microphone was used for the doctor to see and hear the patient’s side. The system was described as highly maneuverable, and surveys conducted with users regarding the appearance and control of the robot resulted in positive outcomes. The communication between the doctor’s side and the patient’s side was accomplished over the Internet, and the doctor was able to send commands to the platform to perform tasks like moving motors. Patient privacy was enhanced, as no medical device feeds were stored.
Another application for telepresence robotics in medicine is assistance in surgeries. Here, a surgical robotic system can be controlled by a surgeon from a distance, offering significant value to patients [42]. It can also be used to allow a doctor to assess a patient from a remote location in surgical inpatient wards [43]. A study was conducted in [44], hypothesizing a positive view of surgical intensive care unit patients and their families for telepresence robots. The study verified the hypothesis, as survey respondents showed positive perceptions of different aspects of this technology.

2.3. Education

The work reported in [45] in 1997 showed the difference between psychological and physical presence. Indeed, a learner can be present physically but not mentally, as learners need to engage and relate to new concepts being taught. The work envisioned telepresence in education and emphasized the role of the fidelity of the communication medium and its ability to convey non-verbal cues.
In [46], the impacts of telepresence robots used with students missing school because of illness were investigated. The study reported findings from available research, suggesting positive impacts on children with chronic illness. The paper suggested that telepresence robot designs can be improved to maximize outcomes and that training for teachers and planning between stakeholders are essential. In a related work, a telepresence model was presented in [47] for usage in primary education. It was designed for students who cannot attend school. The model was designed to allow for interaction between absent students and their classmates and teachers. The model consists of a human-shaped silhouette with a tablet connected to the student through a web server. A survey of different actors using the tool showed a high approval.
In [48], an analysis of the usage of telepresence robots in higher education was conducted. Notably, the effects of factors like perceived usefulness and ease of use on the use intention of telepresence robots were observed. The study concluded that usefulness should be prioritized in the design of telepresence robots and that their complexity and cognitive load demands should be minimized [48]. Another study [49] aimed to explore how personnel in higher education perceived certain aspects of telepresence robots. It showed positive perceptions of telepresence robots and indicated that in comparison to computer-based distance learning, they support the maintenance of social relations between students and teachers. This study also reported data suggesting the need for teacher training before the usage of telepresence robots in education and mentioned the need for more studies to explore the influence of telepresence robots on student learning outcomes and teacher workload. Another application of telepresence robots is foreign language learning, which was addressed in [50], where English learners interacted with a native English speaker through a telepresence robot. After this exploratory study, the potential for telepresence robots in promoting foreign language learning was reported.

2.4. Industry

In industrial environments, telepresence has applications in the operation of systems where robots and humans can interact in the physical world and virtual worlds [51]. A system was proposed in [51] where a remote user was provided with a sense of colocation with a robot through consumer VR systems. This improved the efficiency of the robot in assembly tasks. Robots used in teleoperation can be still or mobile manipulators equipped with scene-sensing devices to transmit information to the human operators. In [52], a mobile manipulator was equipped with an RGB depth camera with the purpose of reconstructing its environment and displaying it to a user with an HMD. The system was designed to allow the user to operate the mobile robot by manipulating a virtual copy of it in a VR environment. This approach ensured low levels of errors in the operation of the robot. Another application for teleoperation is the inspection of potentially dangerous industrial environments by humans. It was addressed in [53], where a robot was equipped to detect gas leaks and to allow users to manipulate objects in the environment from a distant location. The user interface comprised an HMD and a motion tracking system. In a related application, a telepresence system with user viewpoint control was proposed in [54]. It allowed users to observe the environment by moving their heads and to manipulate a robotic arm equipped with a stereo camera.

2.5. Other Applications

An important aspect of the usage of telepresence robots is human–robot interaction (HRI). Indeed, there is an interaction between the user and the robot and another interaction between persons in the remote environment and the robot. In this context, robomorphism was addressed in [55], where an experiment showed that students interacting with others through a telepresence robot attributed robotic characteristics to their interaction partner. The usage of telepresence robots can also be taken into consideration when assessing the human–human interactions taking place through them. A study [56] showed indications of no difference in perception of human affinity when interaction takes place through a telepresence robot versus in person. This study was conducted in the context of a university building tour guided by a student. In addition to the works shown above, telepresence robots have been conceived and used in various other applications. For example, they can be seen in applications for:
  • Attendance at academic conferences: In [57,58], a study of the use of telepresence robots at conferences was presented. Robots were used in different ways, for example, in dedicated configurations, where each remote conference attendee had their own robot, and in configurations in which robots were shared between multiple people at the same time;
  • Work: In [59], the usage of mobile remote presence systems (MRPs) that remote workers use to drive and communicate in a workplace was surveyed. It was reported that MRPs can support informal communications in distributed teams. However, other questions about dealing with MRPs were raised;
  • Entertainment: A telepresence system for entertainment and meetings was presented in [60]. It used a microphone array with 3D sound localization, a depth camera and a webcam with a computer and Internet connection. It was presented as a teleimmersive entertaining video-chat application;
  • People with special needs: A review was conducted on the usage of telepresence robots for people with special needs in [61]. The review considered age-related special needs and disability and showed several applications and robots but concluded that there are still barriers for people with auditory or verbal disabilities. The review also pointed to the lack of clarity of the impact of telepresence robots on quality of life;
  • Virtual tours: Another application for telepresence systems is allowing users to take tours in remote environments. For example, a system called “Virtual Tour” was presented in [62], which consists of a 360° camera and an audio system capturing a remote environment. Captured signals are streamed to the user side, with the user equipped with a VR HMD. In a related application, HRI for telepresence robots was addressed in [63], where a user interface for a telepresence robot was used to visit a remote art gallery. It targeted residents of healthcare facilities and showed their ability to operate a telepresence robot.

3. Telepresence Platforms

In the different surveyed works, different platforms for telepresence were employed, with different configurations and sets of functionalities. In many cases, the platforms consisted of still or mobile robots with image and sound acquisition capacities from the remote side and of display devices like monitors and loudspeakers or head-mounted displays (HMDs) on the user side. Other designs also exist and offer several degrees of immersion and presence to users. Existing robotic platforms designed for human–robot interaction can also be used for telepresence, such as the platform presented in [64]. Aldebaran’s Nao (Nao-ROBOTS: Your Guide to the World of Robotics, Available: https://robotsguide.com/robots/nao/, accessed on 21 June 2023) and Pepper (Pepper-ROBOTS: Your Guide to the World of Robotics, Available: https://robotsguide.com/robots/pepper/, accessed on 21 June 2023) robots have features that allow for their use in telepresence applications [65,66] (see Figure 3).
Table 2 and the following sections present a number of these platforms and their characteristics and usages. While not an exhaustive list of existing platforms, this overview illustrates the different possibilities that exist to enable telepresence experiences for users.

3.1. Double

The Double robot (Double-ROBOTS: Your Guide to the World of Robotics, Available: https://robots.ieee.org/robots/double/, accessed on 14 March 2022) was used in [43]. It consists of a wheeled mobile platform equipped with sensory and motor capabilities, allowing it to navigate and avoid obstacles. It has an adjustable height and provides users with videoconferencing functionalities.

3.2. Immersive Telepresence System [70]

An immersive telepresence system was presented in [70] as a device using an omnidirectional panoptic camera in combination with a VR HMD connected to a client computer. A panoptic camera is a device capable of sensing light from any direction around its center. It consists of 49 cameras whose signals are combined to obtain an omnidirectional image transmitted to the client PC, where left- and right-eye views are generated based on the head orientation and sent to the headset to render a 3D view. The platform is not mobile, and the client side consists of a headset. Proposed applications are in robotic telepresence systems and virtual tourism.
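As a rough illustration of how a head-oriented view can be derived from an omnidirectional image, the sketch below samples a perspective viewport from an equirectangular panorama given a head yaw and pitch. This is a generic resampling technique, not the implementation used in [70]; the image dimensions, field of view and angle conventions are assumptions made for illustration.

```python
import numpy as np

def viewport_from_equirect(pano, yaw, pitch, fov_deg=90.0, out_w=320, out_h=240):
    """Sample a pinhole-camera viewport from an equirectangular panorama.

    pano: H x W x C array covering 360 deg horizontally, 180 deg vertically.
    yaw, pitch: head orientation in radians (0, 0 looks at the panorama center).
    """
    H, W = pano.shape[:2]
    f = (out_w / 2) / np.tan(np.radians(fov_deg) / 2)  # focal length in pixels
    # Pixel grid of the virtual camera, centered at the principal point.
    xs = np.arange(out_w) - out_w / 2
    ys = np.arange(out_h) - out_h / 2
    xv, yv = np.meshgrid(xs, ys)
    # Ray directions in camera coordinates (z forward, x right, y down).
    dirs = np.stack([xv, yv, np.full_like(xv, f, dtype=float)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # Rotate rays by head pitch (about x), then yaw (about y).
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    d = dirs @ (Ry @ Rx).T
    # Convert rotated rays to longitude/latitude, then to panorama pixels.
    lon = np.arctan2(d[..., 0], d[..., 2])        # -pi..pi
    lat = np.arcsin(np.clip(d[..., 1], -1, 1))    # -pi/2..pi/2
    u = ((lon / (2 * np.pi) + 0.5) * W).astype(int) % W
    v = np.clip(((lat / np.pi + 0.5) * H).astype(int), 0, H - 1)
    return pano[v, u]
```

In a stereo setup, two such viewports would be rendered from slightly offset virtual camera positions to approximate the left- and right-eye views delivered to the headset.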

3.3. 3DMVIS

A 3D multimodal visual immersive system (3DMVIS) was presented in [78] as a system allowing for the display of 3D computer-generated and real-world videos, taking into account the head orientation of the user. It consists of a lens module embedded in a head-mounted display, along with an LCD display and a sensing module that tracks head movements. Although this is not a telepresence system itself, a telepresence system was proposed as a combination of the 3DMVIS system, a panoramic camera system and a data processing system, allowing for the delivery of the appropriate images to the eyes of the user. No mobile platform was mentioned, except that the panoramic camera unit was handheld and mobile. A proposed application for this system is house inspection for customers interested in houses located in other cities or countries.

3.4. RDW Telepresence Systems

In [68], a “redirected walking” (see source [2] for an explanation) telepresence platform was proposed. It is based on a 360-degree camera and a head-mounted display. The camera is mounted on a mobile robot. A control mechanism was implemented to allow the user to control the mobile robot’s motion, and the user’s walking can be used for this purpose. Latency in image updates and movement control was reported.

3.5. Highly Immersive Telepresence [18]

In [18], the difficulty of hosting avatar technology on user devices was reported and was said to be due to large data and computation requirements. A design was proposed for a telepresence system and was made suitable for a multiaccess edge computing environment, which can provide large processing power to manage avatars with low latency.

3.6. Immersive Telepresence with Mobile and Wearable Devices [69]

A telepresence system was presented in [69]. Users were defined as “local” and “remote”, where the local user shares visual data of their surroundings with the remote user. These data are captured using either the user’s mobile device camera or a panoramic camera. The two users can communicate through voice and gestures. Gestures of the remote user are sent to the local user so they can be spatially mapped onto the physical environment.

3.7. Collaborative Control in a Telepresence System [67]

A telepresence system was presented in [67] where sound source localization and user intention were used in a collaborative exploitation to control the motion of a mobile robot to track speakers in interaction with the user. The platform transmits 3D visual data from an RGB-D camera to the user equipped with a VR headset and an IMU for orientation tracking. The platform also has a microphone array, a screen and a computer mounted on a Pioneer mobile robot base.
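As a rough sketch of how a sound-source direction and a user's intended heading might be combined to steer such a platform, the snippet below blends the two headings with a weighting factor. The weighting scheme and angle conventions are assumptions made for illustration; this is not the control law used in [67].

```python
import math

def blend_headings(user_yaw, sound_yaw, alpha=0.5):
    """Blend two headings (in radians) on the circle.

    alpha = 1.0 follows the user's intention only; alpha = 0.0 follows
    the localized sound source only. The average is computed on unit
    vectors so that wrap-around at +/- pi is handled correctly.
    """
    x = alpha * math.cos(user_yaw) + (1 - alpha) * math.cos(sound_yaw)
    y = alpha * math.sin(user_yaw) + (1 - alpha) * math.sin(sound_yaw)
    return math.atan2(y, x)

# Example: the user looks straight ahead while a speaker is localized
# 90 degrees to the side; with equal weighting, the commanded heading
# splits the difference at 45 degrees.
cmd = blend_headings(0.0, math.pi / 2, alpha=0.5)
print(f"commanded heading: {math.degrees(cmd):.0f} deg")
```

Averaging on unit vectors rather than raw angles is the standard way to avoid discontinuities when the two headings straddle the +/- pi boundary.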

3.8. Improving the Comfort of Telepresence Communication [79]

The authors of [79] investigated the unpleasantness that can be felt by persons while using telepresence devices. Certain sound stimuli, such as loud voices and breathing into the microphone, were perceived as unpleasant by users. A system relying on machine learning was proposed for the detection and avoidance of such stimuli.
The telepresence device used in this study consisted of a handheld robot controlled by a Raspberry Pi with components like a binaural microphone, a camera module, a display and an audio interface.

3.9. Beaming System [73]

In [73], a system was presented with the name “Beaming”. It relies on a VR system with 3D visuals and audio, tactile and haptic systems and biosensing. The term “transporter” was used for this VR system, and “visitor” was used to denote the user of the transporter. The system allows for the capture of the environment containing the transporter and displays it to the visitor, simultaneously capturing the visitor and displaying them to the persons in the environment.

3.10. Multi-Destination Beaming [16]

The concept of beaming, as presented in [73] as the name of a project, has been used in other studies, such as [16]. In this work, the concept of multiple simultaneous remote destinations was introduced. A participant would have the ability to switch between various possible bodies in different environments instantaneously, with the illusion of ownership over each of these bodies.

3.11. Geocaching with a Beam [75]

In [75], a study was conducted to investigate the usage of a telepresence robot in geocaching. The Beam+ robot (Beam-ROBOTS: Your Guide to the World of Robotics, Available online: https://robots.ieee.org/robots/beam/, accessed on 14 March 2022) allows a person to geocache with a remote partner. Geocaching involves several activities, like walking, conversing and looking for objects. The study showed that embodiment in the form of a robot and the mobility of the robot allowed participants to feel strongly present in the remote space. However, the authors mentioned the limitations of this experience, as remote users did not have a full sensation of the space; components like smells and wind were missing.

3.12. Bidirectional Telepresence Robots [76]

Collaboration between persons at different locations was addressed in [76] in situations where these persons are required to know each other’s position and orientation. A setup was proposed where two telepresence robots were used at two different sites by two collaborators to achieve immersion and experience first-person-view video to grasp each other’s distance and orientation. This approach was shown to be promising.

3.13. Telesuit

A telesuit was proposed in [77] as a control system for a humanoid telepresence robot. It consists of a suit equipped with sensing elements used for tracking and relaying the user’s movements to a humanoid robotic avatar. In the opposite direction, video signals are transmitted from the robotic avatar to be projected to the user through a head-mounted display.
Based on the platforms shown in this section, it is possible to see the differences among concepts and objectives addressed by various works in telepresence. While some rely on existing humanoid robots and exploit their functionalities for telepresence, others use non-humanoid platforms. While the focus of some works was on the user perception of the remote environment, others focused more on the mobility and maneuverability of the platforms. Additionally, while some are fully controlled by the users, others are designed to have degrees of autonomy in their motions. It can be seen that in comparison with each other, each of these works has its limitations, as it cannot provide all the functionalities at the same time. Challenges remain to be tackled in this field, as discussed in Section 5.

4. Components of a Telepresence System

To be able to provide users with perceptions corresponding to presence in other environments, telepresence platforms such as those shown in Section 3 require different functionalities in relation to the tasks they perform and the degrees of immersion they provide. For immersive perceptions, this ranges from audio, video and other sensory signal acquisition, to processing, transmission, reception and display. Similar steps are involved in the transmission of information from the user side to the platform side if the system allows for interaction between the user and the remote environment. Figure 4 shows the landscape of the technologies involved in telepresence systems. A telepresence platform does not have to be equipped with all these technologies, as, for example, some are not mobile, and others differ in their motion control.

4.1. Signal Acquisition

Signals transmitted from a telepresence platform to a user mainly contain visual and auditory information but can also contain other information depending on the tasks performed by the platform, like haptic data in teleaction systems [80,81]. For both auditory and visual information, different factors are involved, such as the number and resolutions of the channels used, among other characteristics. Indeed, visual information displayed to users can rely on one channel, through a display screen for example, or on two channels with head-mounted displays. Similarly, auditory information can rely on one or more channels emitted through one or more loudspeakers. Different works have addressed the signal acquisition task with the aim of improving its applicability in telepresence. The resolution of the cameras used in telepresence systems was addressed in [82], where the tradeoff between increasing the resolution to levels suitable for the human eye and reducing it to decrease the streaming bandwidth was shown. A camera system was proposed that consisted of an omnidirectional camera and a pan–tilt–zoom camera used to increase the resolution for the user’s region of interest within the available bandwidth range. In this system, the orientation and zoom level of the camera are controlled by the user with a head-mounted display.
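As an illustration of this resolution–bandwidth tradeoff, the following sketch compares the approximate bitrate of streaming a single full-resolution panorama against streaming a low-resolution omnidirectional view alongside a high-resolution region-of-interest crop. The 0.1 bit-per-pixel figure is a generic rule of thumb for H.264-class codecs, and all resolutions are illustrative assumptions, not values taken from [82]:

```python
def stream_bitrate_bps(width, height, fps, bits_per_pixel=0.1):
    """Approximate compressed video bitrate in bits per second
    (rule of thumb: ~0.1 bit per pixel for H.264-class codecs)."""
    return width * height * fps * bits_per_pixel

def roi_scheme_bitrate(full_res, roi_res, fps):
    """Total bitrate when a low-resolution omnidirectional view is
    streamed alongside a high-resolution pan-tilt-zoom ROI crop."""
    return (stream_bitrate_bps(*full_res, fps)
            + stream_bitrate_bps(*roi_res, fps))

# Streaming the whole panorama at 4K vs. a 720p panorama plus a 1080p ROI:
full_4k = stream_bitrate_bps(3840, 2160, 30)                 # ~24.9 Mbps
hybrid = roi_scheme_bitrate((1280, 720), (1920, 1080), 30)   # ~9.0 Mbps
```

Under these assumptions, the hybrid scheme needs roughly a third of the bandwidth while still offering full resolution where the user is actually looking.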
A study was conducted in [83] to investigate the effect of the orientation and field of view of a camera used in a telepresence application on the interaction between users. Giraff telepresence systems were used, and the authors suggested that limiting the field of view of the camera can enhance the interaction, which contradicts the assumption that widening the camera’s angle of view is always best.

4.2. Signal Transmission

In many cases, signals acquired in a telepresence application require real-time streaming with high signal-to-noise ratios. Telepresence systems highly depend on signal transmission media, with certain applications critically requiring firm real-time operation. To this end, several works have attempted to increase the reliability of transmission media for telepresence in particular. For example, the development of 5G technologies has provided a flexibility that can enable procedures like remote and robot-assisted surgeries [84,85].
In [86], the possibility of using light fidelity (Li-Fi) technology in telepresence robots was tested. It was mentioned that using Li-Fi, data can be transmitted up to 100 times faster than using Wi-Fi. In the study reported in [75], a mobile hot spot with 4G/LTE technology and data rates of 100 Mbps was used. The transmission of data over long distances was addressed in [87], where the combined use of Wi-Fi and Li-Fi communications was proposed. WebRTC (WebRTC, Available online: https://webrtc.org/, accessed on 14 March 2023) has also been relied on in several previous works [33,47]; it enables voice and video communications and streaming in real time over the Web.
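The data rates mentioned above can be put into perspective with a simple link-budget check. The overhead and headroom factors below are illustrative assumptions, not values from the cited studies:

```python
def fits_link(stream_bps, link_bps, overhead=0.2, headroom=0.5):
    """Check whether a video stream fits within a link budget,
    allowing 20% protocol overhead and treating only half of the
    nominal link rate as sustained usable throughput."""
    usable = link_bps * headroom
    return stream_bps * (1 + overhead) <= usable

LTE_100M = 100e6                     # nominal 4G/LTE rate quoted in [75]
hd_stream = 8e6                      # a typical compressed 1080p stream
raw_1080p = 1920 * 1080 * 24 * 30    # uncompressed 1080p, 24 bpp, 30 fps

fits_link(hd_stream, LTE_100M)   # compressed HD fits comfortably
fits_link(raw_1080p, LTE_100M)   # uncompressed video (~1.5 Gbps) does not
```

The calculation makes explicit why compression is indispensable: even a nominally fast mobile link cannot carry uncompressed HD video.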

4.3. Signal Output

Another component of a telepresence application is the delivery of the transmitted signals to the user, which can be achieved using a screen and a loudspeaker, as in Beam, Ava (Ava-ROBOTS: Your Guide to the World of Robotics, Available online: https://robots.ieee.org/robots/ava/, accessed on 14 March 2023) and QB (QB-ROBOTS: Your Guide to the World of Robotics, Available online: https://robots.ieee.org/robots/qb/, accessed on 14 March 2023). Other works have used HMDs [67,68,70,88] like Oculus (Oculus Rift S: PC-Powered VR Gaming Headset | Oculus, Available online: https://www.oculus.com/rift-s/, accessed on 14 March 2023) and HTC Vive (VIVE - VR Headsets, Games, and Metaverse Life | United States, Available online: https://www.vive.com/us/, accessed on 14 March 2023). The difference between monitors and HMDs as output media was studied in [89], specifically for the perception of human-like characters; the study revealed that users felt higher levels of immersion in virtual environments through HMDs.

4.4. Mobility

The navigation of mobile telepresence robots was addressed in [90], where it was shown that dynamic human-populated environments can be challenging. Another constraining aspect of robot mobility and navigation is the presence of stairs or rough terrain in the areas of operation. This was addressed in [86], where a mechanical design consisting of a robot with both wheels and legs was proposed. The studies conducted in [32] and [30] showed the importance of appropriate robot control in path and motion planning.
Among the robotic platforms used in telepresence, the Pioneer P3-DX (Pioneer 3-DX, Available online: https://robots.ros.org/pioneer-3-dx/, accessed on 13 March 2023) (Pioneer 3-ROBOTS: Your Guide to the World of Robotics, Available online: https://robots.ieee.org/robots/pioneer/, accessed on 13 March 2023) has been used in studies like [68] and [67]. It is a wheel-based, differential-drive platform designed for autonomous navigation that can be extended with more functionalities. The Pioneer 3 robot has a height of around 24 cm. Other robots have been designed to be taller, with limited lengths and widths allowing them to have satisfying mobility and maneuverability capabilities. Among these robots, Beam, Ava, QB, Double, VGo (VGo - ROBOTS: Your Guide to the World of Robotics, Available online: https://robots.ieee.org/robots/vgo/, accessed on 14 March 2022) and Vita (Vita-ROBOTS: Your Guide to the World of Robotics, Available online: https://robots.ieee.org/robots/rpvita/, accessed on 14 March 2022) have different heights, which can exceed 180 cm; QB, for example, has an adjustable height with a minimum of 76 cm.
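Differential-drive platforms such as the Pioneer P3-DX are commonly described with unicycle kinematics, where the pose is advanced from the left and right wheel speeds. The following sketch illustrates this model; the wheel-base value is an illustrative assumption, not the exact Pioneer specification:

```python
import math

def diff_drive_step(x, y, theta, v_left, v_right, wheel_base, dt):
    """Advance a differential-drive pose (x, y, heading) by one time
    step, given left/right wheel speeds (unicycle kinematic model)."""
    v = (v_left + v_right) / 2.0             # linear velocity (m/s)
    omega = (v_right - v_left) / wheel_base  # angular velocity (rad/s)
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Equal wheel speeds drive straight; unequal speeds produce an arc.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = diff_drive_step(*pose, v_left=0.3, v_right=0.3,
                           wheel_base=0.33, dt=0.01)
# After 1 s at 0.3 m/s the robot has advanced ~0.3 m along x.
```

This is the kinematic basis on which autonomous navigation and user teleoperation commands are both ultimately expressed on such platforms.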

4.5. Motion Control

Another aspect of robotic platforms with degrees of freedom allowing them to perform actions is motion control. These actions can range from displaying emotions to manipulation in the environment where they are located. During social interaction, the remotely present person may have reactions that can be translated into movements of the telepresence platform. These movements can be controlled by the user or performed autonomously by the platform. Autonomous movements for telepresence robots were addressed in [79], where the perspectives of local and remote users and their impressions were taken into consideration. Movements were classified as voluntary and involuntary, and the authors concluded that automating both types of movements can improve the users’ impressions unless conflicts arise between the autonomous movements and the remote operation commands. The robot used in this study was a Robovie-mR2, which has 18 servomotors, allowing for control of different parts of the face and the rest of the body. Vision and audio sensors were installed on this platform for the purposes of the study.
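One simple way to let autonomous movements coexist with remote operation is a per-degree-of-freedom priority scheme in which any nonzero operator input suppresses the corresponding autonomous motion. This is an illustrative sketch of such an arbitration rule, not the mechanism actually used in [79]:

```python
def arbitrate(operator_cmd, autonomous_cmd, threshold=1e-6):
    """Merge operator and autonomous motion commands per degree of
    freedom, giving priority to the operator: any nonzero operator
    input suppresses the autonomous behaviour for that DOF."""
    merged = {}
    for dof in set(operator_cmd) | set(autonomous_cmd):
        op = operator_cmd.get(dof, 0.0)
        auto = autonomous_cmd.get(dof, 0.0)
        merged[dof] = op if abs(op) > threshold else auto
    return merged

# Autonomous idling motion on the head, operator steering the base
# (DOF names are hypothetical):
auto = {"head_pan": 0.1, "base_v": 0.0}
tele = {"head_pan": 0.0, "base_v": 0.5}
arbitrate(tele, auto)  # head keeps idling; base obeys the operator
```

A scheme of this kind avoids the conflicts noted above by construction, since the two command sources never drive the same joint at the same time.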
In [22], an approach for designing and controlling telepresence mobile robots for social interactions was proposed. A list of design features was presented, comprising elements like handling, maneuverability, controllability, robustness, user interface and dynamic performance. The safety of humans in the presence of telepresence robots was also addressed, and potential hazards were identified, such as the lack of knowledge about the robot’s motion path, failures in the control software and hardware, and access by an unauthorized operator.
One of the components of human–robot interaction and human–human interaction is gaze. Gaze was suggested in [91] as an input method allowing the control of telepresence robots. It was designed to assist persons with disabilities in their daily interactions. A system was proposed to implement this method. It consisted of a virtual reality head-mounted display with eye trackers and a telerobot with a 360-degree camera streaming live video to the head-mounted display.
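A gaze-based control method such as the one proposed in [91] requires mapping gaze coordinates to robot motion commands. The following sketch shows a generic mapping with a central dead zone so that merely looking around does not move the robot; the axis conventions, gains and dead-zone size are illustrative assumptions, not those of [91]:

```python
def gaze_to_velocity(gaze_x, gaze_y, dead_zone=0.15,
                     max_linear=0.5, max_angular=1.0):
    """Map a normalized gaze point (-1..1 per axis, (0, 0) at the
    image centre) to (linear, angular) velocity commands, with a
    central dead zone for resting the gaze without moving."""
    def shape(value, limit):
        if abs(value) < dead_zone:
            return 0.0
        sign = 1.0 if value > 0 else -1.0
        # Rescale the range beyond the dead zone to [0, 1].
        return sign * limit * (abs(value) - dead_zone) / (1 - dead_zone)

    linear = shape(-gaze_y, max_linear)    # look up -> move forward
    angular = shape(-gaze_x, max_angular)  # look left -> turn left
    return linear, angular

gaze_to_velocity(0.0, 0.0)   # gaze at centre: robot stays idle
gaze_to_velocity(0.0, -1.0)  # gaze at top edge: full forward command
```

The dead zone is the key design choice here: it separates gaze used for looking (perception) from gaze used for steering (control), a distinction any gaze-driven interface must make.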

5. Discussion and Novelties

This section presents an analysis and a discussion of important points found throughout the review. We also highlight and describe research gaps observed in some of the reviewed papers.

5.1. Discussion

In the different settings and applications of telepresence, facilitating and challenging factors exist. For example, in aged care, factors like the feeling of physical presence and ease of use are facilitators, while privacy and Internet connectivity are barriers [92]. Similar findings have been reported for telepresence in education [93]. In medicine and specifically in intensive care units, telepresence robots have been found to be advantageous in lowering response times and mortality rates, despite facing the challenges of regulatory and financial barriers [94].
For future work in telepresence, it is possible to envision platforms offering more functionalities and features to users. This can be achieved by stimulating all the senses of users and allowing them to perform actions in remote environments. The presence of a user in a remote environment can also be made more realistic through the usage of avatars. This has been proposed in VR with modular codec avatars [95] and in mixed reality with full-body avatars, enabling interaction for users [96]. An extension of telepresence technology was proposed in [97] as a concept enabling new human remote-control interfaces. In comparison with a standard telepresence setup where videoconferencing is employed, the proposed extension consists of the following points.
  • Equipping a mobile robot with a camera, microphone and speaker on its head and equipping the user with a VR headset with earphones, a microphone and a remote controller for speed and direction;
  • Equipping a robot with an arm and the user with a remote controller with the aim of enabling the ability to grab physical objects;
  • Using other devices like a glove with finger sensors and a body suit with joint sensors to improve the body control of a robot.
With the development of manufacturing, sensing and transmission technologies, the above can be accomplished relatively easily, despite the challenge of user satisfaction. For instance, the robotic platform presented in [98] and shown in Figure 5 can be equipped with visual and sound-sensing capacities corresponding to a telepresence application. Additionally, it can be mounted on a mobile platform and augmented with actuation capabilities, allowing the remote user to control its actions, like arm manipulation and head movements. Such a platform can be used for several applications, notably in education as an interactive multimodal tool. Additionally, it is possible to exploit a single telepresence platform for use by different users. While some works were presented in which a person can be present in multiple places at the same time, other work has been conducted in which multiple persons can access the same remote location through the same platform. This is referred to as “multipresence” [99].
The future of telepresence work and applications can be envisioned in line with the developments taking place in VR, AR and mixed reality (MR). The technologies used in telepresence are closely interleaved with VR, AR and MR. In this context, AR and MR can be used to support collaborative work [100], with applications in telemedicine, tele-education [101,102,103] and codesign in manufacturing [104]. Concepts like local and remote users are also clearly present in remote collaboration, with social and copresence factors affecting the user experience [105]. It was shown in [101] that telepresence robots can be used to address the limitations of AR/MR-based remote collaborative systems, which offer limited telepresence and naturalness, along with poor user experiences. The olfactory sense and haptics can also improve user experiences in telepresence, as in remote collaboration [105,106]. Regarding the visual modality of telepresence, it was reported in [102] that although immersive technologies have previously been shown to potentially enhance the human perception of 3D data, they are not always better than traditional workstations. This raises questions about identifying the correct research directions to improve user experiences and the functionalities of telepresence.

5.2. Novelties

Other challenges facing telepresence robot designers are presented here, along with identified gaps that constitute points to take into consideration when working on telepresence applications. Importantly, attention must be focused on assessing the different aspects of user experiences of telepresence systems. Indeed, a large number of publications do not take some of these aspects into account, which constitutes a gap that can reduce the impact that telepresence systems are intended to have.
Interaction in social mobile telepresence can be evaluated in terms of quality and in comparison with interaction between persons. This was the topic of the study reported in [107], where several tools were used. A questionnaire that assesses the perceived presence and ease of use was implemented, and theories about spatial formations and their influence on the quality of the interaction were established. Aside from the aspects of immersion, telepresence media can be associated with user experiences like VR sickness and oscillopsia. VR sickness is associated with symptoms like disorientation and nausea and is caused by factors like hardware (e.g., field of view and latency) and content (e.g., graphic realism and duration) [108]. Oscillopsia is associated with an instability in the visual world of an observer [109,110,111] and can be seen in applications where a camera transmitting visual information to an observer has to adjust its orientation to follow the observer [70]. Studies have been conducted on ways to reduce these effects [112,113], and they are important to take into account when designing immersive telepresence systems.

6. Conclusion and Future Work

This paper reviewed the current status of telepresence by going over different applications, developed platforms and technologies. While great potential for telepresence has been witnessed and its benefits have been demonstrated in different applications, several challenges remain to be faced before it becomes well accepted and regulated. The dependability of telepresence platforms also relies on other fields of research and technology, and advances in robotics, VR, AI, signal transmission and other fields can push telepresence forward and into further applications.
The content of this paper may be of interest to researchers already working in telepresence or new to the field. The surveyed work is not exhaustive, and the purpose was to gather relevant sources related to the different topics addressed in this paper so as to provide a clear view of recent advances. While some relevant sources may have been missed, the reviewed sources provide insights on telepresence, supporting future work, as shown next.
Among the assessed works, it is noticeable that there is no unique definition of telepresence itself, nor of related concepts like immersion. In some works, telepresence can be achieved through screen- and loudspeaker-based video conferencing, while in others, it is achieved with auditory, visual and even tactile stimuli perceived by users through head-mounted displays and other means. Telepresence platforms can be stationary or mobile, and when mobile, they have different degrees of autonomy. Moreover, to achieve immersion, displayed signals can be obtained through different means, allowing for different levels of 3D or 2D perception of environments. While some works developed ad hoc platforms, others relied on existing platforms in their telepresence applications. Whether or not complete immersive telepresence can be achieved is related to several factors, with a high level of dependency on advances in signal acquisition, transmission and display technologies. Still, the meaning of the term “complete” here varies among researchers and applications.
In the future, this review will be extended to cover different areas, notably:
  • A separate focus on the different applications mentioned in this paper;
  • User acceptability of telepresence systems and ways to evaluate the user perceptions of telepresence systems they use. This can be studied as a function of each application and can be achieved with questionnaires. A review of user acceptability of VR and AR systems can also be of importance in this field;
  • The relation of telepresence with user experiences that can have negative effects on users, such as VR sickness and oscillopsia. This is an important question to address when designing telepresence systems, and a clear understanding of this topic must be obtained.

Author Contributions

Conceptualization, K.Y., S.S., S.A.K. and T.B.; methodology, K.Y., S.S., S.A.K. and T.B.; software, K.Y. and S.S.; validation, S.S., S.A.K. and T.B.; formal analysis, K.Y.; investigation, K.Y. and S.S.; resources, K.Y., S.S., S.A.K. and T.B.; data curation, K.Y. and S.S.; writing—original draft preparation, K.Y. and S.S.; writing—review and editing, K.Y., S.S., S.A.K. and T.B.; visualization, K.Y. and S.S.; supervision, S.A.K. and T.B.; project administration, S.A.K. and T.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors thank the anonymous reviewers for their comments and suggestions that helped to improve the information quality and organization.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hartmann, T.; Klimmt, C.; Vorderer, P. Telepresence and entertainment. In Immersed in Media. Telepresence in Everyday Life; Bracken, C., Skalski, P., Eds.; Routledge: Milton Park, UK, 2010; pp. 137–157. [Google Scholar]
  2. Draper, J.V.; Kaber, D.B.; Usher, J.M. Telepresence. Hum. Factors 1998, 40, 354–375. [Google Scholar] [CrossRef] [PubMed]
  3. Steuer, J. Defining Virtual Reality: Dimensions Determining Telepresence. J. Commun. 1992, 42, 73–93. [Google Scholar] [CrossRef]
  4. Minsky, M. Telepresence. OMNI, June 1980; pp. 44–52. [Google Scholar]
  5. Furht, B. (Ed.) Virtual Presence. In Encyclopedia of Multimedia; Springer: Boston, MA, USA, 2008; pp. 967–968. [Google Scholar] [CrossRef]
  6. Kristoffersson, A.; Coradeschi, S.; Loutfi, A. A Review of Mobile Robotic Telepresence. Adv. Hum-Comput. Interact. 2013, 2013, 902316. [Google Scholar] [CrossRef] [Green Version]
  7. Niemelä, M.; Van Aerschot, L.; Tammela, A.; Aaltonen, I.; Lammi, H. Towards Ethical Guidelines of Using Telepresence Robots in Residential Care. Int. J. Soc. Robot. 2021, 13, 431–439. [Google Scholar] [CrossRef] [Green Version]
  8. Cesta, A.; Cortellessa, G.; Orlandini, A.; Tiberio, L. Long-Term Evaluation of a Telepresence Robot for the Elderly: Methodology and Ecological Case Study. Int. J. Soc. Robot. 2016, 8, 421–441. [Google Scholar] [CrossRef] [Green Version]
  9. Darvish, K.; Penco, L.; Ramos, J.; Cisneros, R.; Pratt, J.; Yoshida, E.; Ivaldi, S.; Pucci, D. Teleoperation of Humanoid Robots: A Survey. IEEE Trans. Robot. 2023, 39, 1706–1727. [Google Scholar] [CrossRef]
  10. Beck, S. Immersive Telepresence Systems and Technologies. Ph.D. Thesis, Bauhaus-Universität Weimar, Weimar, Germany, 2019. [Google Scholar] [CrossRef]
  11. Stotko, P.; Krumpen, S.; Weinmann, M.; Klein, R. Efficient 3D Reconstruction and Streaming for Group-Scale Multi-Client Live Telepresence. In Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Singapore, 17–21 October 2019; pp. 19–25. [Google Scholar] [CrossRef] [Green Version]
  12. Wissmath, B.; Weibel, D.; Schmutz, J.; Mast, F.W. Being Present in More Than One Place at a Time? Patterns of Mental Self-Localization. Conscious. Cogn. 2011, 20, 1808–1815. [Google Scholar] [CrossRef]
  13. Gooskens, G. Where Am I? The Problem of Bilocation in Virtual Environments. Postgrad. J. Aesthet. 2010, 7, 13–24. [Google Scholar]
  14. Furlanetto, T.; Bertone, C.; Becchio, C. The bilocated mind: New perspectives on self-localization and self-identification. Front. Hum. Neurosci. 2013, 7, 71. [Google Scholar] [CrossRef] [Green Version]
  15. Lenggenhager, B.; Mouthon, M.; Blanke, O. Spatial aspects of bodily self-consciousness. Conscious. Cogn. 2009, 18, 110–117. [Google Scholar] [CrossRef]
  16. Kishore, S.; Xavier, M.; Bourdin Kreitz, P.; Berkers, K.; Friedman, D.; Slater, M. Multi-Destination Beaming: Apparently Being in Three Places at Once Through Robotic and Virtual Embodiment. Front. Robot. AI 2016, 3, 65. [Google Scholar] [CrossRef] [Green Version]
  17. Petkova, V.; Ehrsson, H. If I Were You: Perceptual Illusion of Body Swapping. PLoS ONE 2008, 3, e3832. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Kim, Y.; Joo, Y.; Cho, H.; Park, I. Highly Immersive Telepresence with Computation Offloading to Multi-Access Edge Computing. In Proceedings of the 2020 International Conference on Information and Communication Technology Convergence (ICTC), Jeju Island, Republic of Korea, 21–23 October 2020; pp. 860–862. [Google Scholar] [CrossRef]
  19. Păvăloiu, I.B.; Vasilățeanu, A.; Popa, R.; Scurtu, D.; Hang, A.; Goga, N. Healthcare Robotic Telepresence. In Proceedings of the 2021 13th International Conference on Electronics, Computers and Artificial Intelligence (ECAI), Pitesti, Romania, 1–3 July 2021; pp. 1–6. [Google Scholar] [CrossRef]
  20. Youssef, K.; Said, S.; Alkork, S.; Beyrouthy, T. Social Robotics in Education: A Survey on Recent Studies and Applications. Int. J. Emerg. Technol. Learn. (IJET) 2023, 18, 67–82. [Google Scholar] [CrossRef]
  21. Youssef, K.; Said, S.; Alkork, S.; Beyrouthy, T. A Survey on Recent Advances in Social Robotics. Robotics 2022, 11, 75. [Google Scholar] [CrossRef]
  22. Belay Tuli, T.; Olana Terefe, T.; Ur Rashid, M.M. Telepresence Mobile Robots Design and Control for Social Interaction. Int. J. Soc. Robot. 2020, 13, 877–886. [Google Scholar] [CrossRef] [PubMed]
  23. Van Erp, J.B.; Sallaberry, C.; Brekelmans, C.; Dresscher, D.; Ter Haar, F.; Englebienne, G.; Van Bruggen, J.; De Greeff, J.; Pereira, L.F.S.; Toet, A.; et al. What Comes After Telepresence? Embodiment, Social Presence and Transporting One’s Functional and Social Self. In Proceedings of the 2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Prague, Czech Republic, 9–12 October 2022; pp. 2067–2072. [Google Scholar] [CrossRef]
  24. Marques, B.; Ferreira, C.; Silva, S.; Dias, P.; Santos, B.S. Is social presence (alone) a general predictor for good remote collaboration? comparing video and augmented reality guidance in maintenance procedures. Virtual Real. 2023, 24, 1–4. [Google Scholar] [CrossRef]
  25. Altalbe, A.A.; Khan, M.N.; Tahir, M. Design of a Telepresence Robot to Avoid Obstacles in IoT-Enabled Sustainable Healthcare Systems. Sustainability 2023, 15, 5692. [Google Scholar] [CrossRef]
  26. Velinov, A.; Koceski, S.; Koceska, N. Review of the Usage of Telepresence Robots in Education. Balk. J. Appl. Math. Informat. 2021, 4, 27–40. [Google Scholar] [CrossRef]
  27. Telepresence Robots: Worldwide Markets to 2026 by Type, Component and Application-ResearchAndMarkets.com. Available online: https://apnews.com/article/technology-robotics-4d39044f033e48419591712c12f7ce08 (accessed on 12 June 2023).
  28. Telepresence Robots Market Worth $8 Billion by 2023 Says a New Research at ReportsnReports. Available online: https://www.prnewswire.com/news-releases/telepresence-robots-market-worth-8-billion-by-2023-says-a-new-research-at-reportsnreports-629894233.html (accessed on 12 June 2023).
  29. Smith, C.; Gregorio, M.; Hung, L. Facilitators and barriers to using telepresence robots in aged care settings: A scoping review protocol. BMJ Open 2021, 11, e051769. [Google Scholar] [CrossRef]
  30. Laniel, S.; Létourneau, D.; Grondin, F.; Labbé, M.; Ferland, F.; Michaud, F. Toward enhancing the autonomy of a telepresence mobile robot for remote home care assistance. Paladyn J. Behav. Robot. 2021, 12, 214–237. [Google Scholar] [CrossRef]
  31. Introduction|Telepresence in the Healthcare Setting. Available online: https://mypages.unh.edu/telepresence/introduction (accessed on 23 February 2023).
  32. Koceska, N.; Koceski, S.; Beomonte Zobel, P.; Trajkovik, V.; Garcia, N. A Telemedicine Robot System for Assisted and Independent Living. Sensors 2019, 19, 834. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  33. Ha, V.K.L.; Chai, R.; Nguyen, H.T. A Telepresence Wheelchair with 360-Degree Vision Using WebRTC. Appl. Sci. 2020, 10, 369. [Google Scholar] [CrossRef] [Green Version]
  34. Elgibreen, H.; Ali, G.; AlMegren, R.; AlEid, R.; AlQahtani, S. Telepresence Robot System for People with Speech or Mobility Disabilities. Sensors 2022, 22, 8746. [Google Scholar] [CrossRef] [PubMed]
  35. Isabet, B.; Pino, M.; Lewis, M.; Benveniste, S.; Rigaud, A.S. Social Telepresence Robots: A Narrative Review of Experiments Involving Older Adults before and during the COVID-19 Pandemic. Int. J. Environ. Res. Public Health 2021, 18, 3597. [Google Scholar] [CrossRef] [PubMed]
  36. Fiorini, L.; Rovini, E.; Russo, S.; Toccafondi, L.; D’Onofrio, G.; Cornacchia Loizzo, F.; Bonaccorsi, M.; Giuliani, F.; Vignani, G.; Sancarlo, D.; et al. On the Use of Assistive Technology during the COVID-19 Outbreak: Results and Lessons Learned from Pilot Studies. Sensors 2022, 22, 6631. [Google Scholar] [CrossRef] [PubMed]
  37. Ha, V.K.L.; Nguyen, T.N.; Nguyen, H.T. Real-time transmission of panoramic images for a telepresence wheelchair. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 3565–3568. [Google Scholar] [CrossRef]
  38. Telehealth: Technology Meets Healthcare-Mayo Clinic. Available online: https://www.mayoclinic.org/healthy-lifestyle/consumer-health/in-depth/telehealth/art-20044878 (accessed on 23 February 2023).
  39. Catania, L.J. 6-Current AI applications in medical therapies and services. In Foundations of Artificial Intelligence in Healthcare and Bioscience; Catania, L.J., Ed.; Academic Press: Cambridge, MA, USA, 2021; pp. 199–291. [Google Scholar] [CrossRef]
  40. Carranza, K.; Day, N.; Lin, L.; Ponce, A.; Reyes, W.; Abad, A.; Baldovino, R. Akibot: A Telepresence Robot for Medical Teleconsultation. In Proceedings of the 2018 IEEE 10th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM), Baguio City, PH, USA, 29 November–2 December 2018; pp. 1–4. [Google Scholar] [CrossRef]
  41. Leite, I.; Castellano, G.; Pereira, A.; Martinho, C.; Paiva, A. Long-Term Interactions with Empathic Robots: Evaluating Perceived Support in Children. In Social Robotics; Ge, S.S., Khatib, O., Cabibihan, J.J., Simmons, R., Williams, M.A., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; pp. 298–307. [Google Scholar]
  42. Anvari, M. Robot-Assisted Remote Telepresence Surgery. Semin. Laparosc. Surg. 2004, 11, 123–128. [Google Scholar] [CrossRef] [PubMed]
  43. Croghan, S.; Carroll, P.; Reade, S.; Gillis, A.; Ridgway, P. Robot Assisted Surgical Ward Rounds: Virtually Always There. J. Innov. Health Informat. 2018, 25, 041. [Google Scholar] [CrossRef] [Green Version]
  44. Sucher, J.; Todd, S.; Jones, S.; Throckmorton, T.; Turner, K.; Moore, F. Robotic telepresence: A helpful adjunct that is viewed favorably by critically ill surgical patients. Am. J. Surg. 2011, 202, 843–847. [Google Scholar] [CrossRef]
  45. Fowler, C.; Mayes, J. Applying Telepresence to Education. BT Technol. J. 1997, 15, 188–195. [Google Scholar] [CrossRef]
  46. Page, A.; Charteris, J.; Berman, J. Telepresence Robot Use for Children with Chronic Illness in Australian Schools: A Scoping Review and Thematic Analysis. Int. J. Soc. Robot. 2021, 13, 1–13. [Google Scholar] [CrossRef]
  47. Yovera Chavez, D.; Villena Romero, G.; Barrientos Villalta, A.; Cuadros Gálvez, M. Telepresence Technological Model Applied to Primary Education. In Proceedings of the 2020 IEEE XXVII International Conference on Electronics, Electrical Engineering and Computing (INTERCON), Lima, Peru, 3–5 September 2020; pp. 1–4. [Google Scholar] [CrossRef]
  48. Lei, M.; Clemente, I.; Liu, H.; Bell, J. The Acceptance of Telepresence Robots in Higher Education. Int. J. Soc. Robot. 2022, 14, 1–18. [Google Scholar] [CrossRef] [PubMed]
  49. Leoste, J.; Virkus, S.; Talisainen, A.; Tammemäe, K.; Kangur, K.; Petriashvili, I. Higher education personnel’s perceptions about telepresence robots. Front. Robot. AI 2022, 9, 976836. [Google Scholar] [CrossRef] [PubMed]
  50. Liao, J.; Lu, X. Exploring the affordances of telepresence robots in foreign language learning. Lang. Learn. Technol. 2018, 22, 20–32. [Google Scholar]
  51. Lipton, J.; Fay, A.; Rus, D. Baxter’s Homunculus: Virtual Reality Spaces for Teleoperation in Manufacturing. IEEE Robot. Autom. Lett. 2017, 3, 179–186. [Google Scholar] [CrossRef] [Green Version]
  52. Kuo, C.Y.; Huang, C.C.; Tsai, C.H.; Shi, Y.S.; Smith, S. Development of an immersive SLAM-based VR system for teleoperation of a mobile manipulator in an unknown environment. Comput. Ind. 2021, 132, 103502.
  53. Schmidt, L.; Hegenberg, J.; Cramar, L. User studies on teleoperation of robots for plant inspection. Ind. Robot. Int. J. 2014, 41, 6–14.
  54. Luo, L.; Weng, D.; Hao, J.; Tu, Z.; Jiang, H. Viewpoint-Controllable Telepresence: A Robotic-Arm-Based Mixed-Reality Telecollaboration System. Sensors 2023, 23, 4113.
  55. Schouten, A.P.; Portegies, T.C.; Withuis, I.; Willemsen, L.M.; Mazerant-Dubois, K. Robomorphism: Examining the effects of telepresence robots on between-student cooperation. Comput. Hum. Behav. 2022, 126, 106980.
  56. Keller, L.; Pfeffel, K.; Huffstadt, K.; Müller, N.H. Telepresence Robots and Their Impact on Human-Human Interaction. In Proceedings of the Learning and Collaboration Technologies. Human and Technology Ecosystems: 7th International Conference, LCT 2020, Held as Part of the 22nd HCI International Conference, HCII 2020, Copenhagen, Denmark, 19–24 July 2020; Springer International Publishing: Cham, Switzerland, 2020; pp. 448–463.
  57. Rae, I.; Neustaedter, C. Robotic Telepresence at Scale. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 313–324.
  58. Neustaedter, C.; Singhal, S.; Pan, R.; Heshmat, Y.; Forghani, A.; Tang, J. From Being There to Watching: Shared and Dedicated Telepresence Robot Usage at Academic Conferences. ACM Trans. Comput.-Hum. Interact. 2018, 25, 1–39.
  59. Lee, M.K.; Takayama, L. “Now, I have a body”: Uses and social norms for mobile remote presence in the workplace. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; pp. 33–42.
  60. Nguyen, V.; Luo, Z.; Zhao, S.; Vu, T.; Yang, H.; Douglas, J.; Do, M. ITEM: Immersive Telepresence for Entertainment and Meetings-A Practical Approach. IEEE J. Sel. Top. Signal Process. 2014, 9, 546–561.
  61. Zhang, G.; Hansen, J. Telepresence Robots for People with Special Needs: A Systematic Review. Int. J. Hum.-Comput. Interact. 2022, 38, 1651–1667.
  62. Kachach, R.; Perez, P.; Villegas, A.; Gonzalez-Sosa, E. Virtual Tour: An Immersive Low Cost Telepresence System. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Atlanta, GA, USA, 22–26 March 2020; pp. 504–506.
  63. Tsui, K.; Dalphond, J.; Brooks, D.; Medvedev, M.; McCann, E.; Allspaw, J.; Kontak, D.; Yanco, H. Accessible Human-Robot Interaction for Telepresence Robots: A Case Study. Paladyn, J. Behav. Robot. 2015, 6, 000010151520150001.
  64. Youssef, K.; Said, S.; Beyrouthy, T.; Alkork, S. A Social Robot with Conversational Capabilities for Visitor Reception: Design and Framework. In Proceedings of the 2021 4th International Conference on Bio-Engineering for Smart Technologies (BioSMART), Paris, France, 8–10 December 2021; pp. 1–4.
  65. Avalos, J.; Cortez, S.; Vasquez, K.; Murray, V.; Ramos, O.E. Telepresence using the Kinect sensor and the NAO robot. In Proceedings of the 2016 IEEE 7th Latin American Symposium on Circuits & Systems (LASCAS), Florianopolis, Brazil, 28 February–2 March 2016; pp. 303–306.
  66. Facilitate a Smooth Connection between People with Pepper’s Telepresence Capabilities! Available online: https://www.aldebaran.com/en/pepper-telepresence (accessed on 18 June 2023).
  67. Du, J.; Do, H.; Sheng, W. Human–Robot Collaborative Control in a Virtual-Reality-Based Telepresence System. Int. J. Soc. Robot. 2021, 13, 1–12.
  68. Zhang, J. Extended Abstract: Natural Human-Robot Interaction in Virtual Reality Telepresence Systems. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Reutlingen, Germany, 18–22 March 2018; pp. 812–813.
  69. Young, J.; Langlotz, T.; Cook, M.; Mills, S.; Regenbrecht, H. Immersive Telepresence and Remote Collaboration using Mobile and Wearable Devices. IEEE Trans. Vis. Comput. Graph. 2019, 25, 1908–1918.
  70. Gaemperle, L.; Seyid, K.; Popovic, V.; Leblebici, Y. An Immersive Telepresence System Using a Real-Time Omnidirectional Camera and a Virtual Reality Head-Mounted Display. In Proceedings of the IEEE International Symposium on Multimedia, Taichung, Taiwan, 10–12 December 2015; pp. 175–178.
  71. Osawa, M.; Imai, M. A Robot for Test Bed Aimed at Improving Telepresence System and Evasion from Discomfort Stimuli by Online Learning. Int. J. Soc. Robot. 2020, 12, 187–199.
  72. Matsumura, R.; Shiomi, M.; Nakagawa, K.; Shinozawa, K.; Miyashita, T. A Desktop-Sized Communication Robot: “robovie-mR2”. J. Robot. Mechatronics 2016, 28, 107–108.
  73. Steed, A.; Steptoe, W.; Oyekoya, W.; Pece, F.; Weyrich, T.; Kautz, J.; Friedman, D.; Peer, A.; Solazzi, M.; Tecchia, F.; et al. Beaming: An Asymmetric Telepresence System. IEEE Comput. Graph. Appl. 2012, 32, 10–17.
  74. Jung, M.; Kim, J.; Han, K.; Kim, K. Social Telecommunication Experience with Full-Body Ownership Humanoid Robot. Int. J. Soc. Robot. 2022, 14, 1–14.
  75. Heshmat, Y.; Jones, B.; Xiong, X.; Neustaedter, C.; Tang, A.; Riecke, B.; Yang, L. Geocaching with a Beam: Shared Outdoor Activities through a Telepresence Robot with 360 Degree Viewing. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–13.
  76. Katayama, N.; Inoue, T.; Shigeno, H. Sharing the Positional Relationship with the Bidirectional Telepresence Robots. In Proceedings of the 2018 IEEE 22nd International Conference on Computer Supported Cooperative Work in Design (CSCWD), Nanjing, China, 9–11 May 2018; pp. 325–329.
  77. Cardenas, I.S.; Vitullo, K.A.; Park, M.; Kim, J.H.; Benitez, M.; Chen, C.; Ohrn-McDaniels, L. Telesuit: An Immersive User-Centric Telepresence Control Suit. In Proceedings of the 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Republic of Korea, 11–14 March 2019; pp. 654–655.
  78. Luo, H.; Pan, T.S.; Pan, J.S.; Chu, S.C.; Yang, B. Development of a Three-Dimensional Multimode Visual Immersive System With Applications in Telepresence. IEEE Syst. J. 2015, 4, 2818–2828.
  79. Osawa, M.; Okuoka, K.; Takimoto, Y.; Imai, M. Is Automation Appropriate? Semi-autonomous Telepresence Architecture Focusing on Voluntary and Involuntary Movements. Int. J. Soc. Robot. 2020, 12, 1119–1134.
  80. Hinterseer, P.; Steinbach, E.; Buss, M. A novel, psychophysically motivated transmission approach for haptic data streams in telepresence and teleaction systems. In Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, Singapore, 22–27 May 2005; Volume 2, pp. ii/1097–ii/1100.
  81. Hinterseer, P.; Hirche, S.; Chaudhuri, S.; Steinbach, E.; Buss, M. Perception-Based Data Reduction and Transmission of Haptic Data in Telepresence and Teleaction Systems. IEEE Trans. Signal Process. 2008, 56, 588–597.
  82. Syawaludin, M.F.; Kim, C.; Hwang, J.I. Hybrid Camera System for Telepresence with Foveated Imaging. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1173–1174.
  83. Kiselev, A.; Kristoffersson, A.; Loutfi, A. The effect of field of view on social interaction in mobile Robotic telepresence systems. In Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction, Bielefeld, Germany, 3–6 March 2014; pp. 214–215.
  84. Pandav, K.; Te, A.G.; Tomer, N.; Nair, S.S.; Tewari, A.K. Leveraging 5G technology for robotic surgery and cancer care. Cancer Rep. 2022, 5, e1595.
  85. Qureshi, H.N.; Manalastas, M.; Ijaz, A.; Imran, A.; Liu, Y.; Al Kalaa, M. Communication Requirements in 5G-Enabled Healthcare Applications: Review and Considerations. Healthcare 2022, 10, 293.
  86. Tota, P.; Vaida, M.F. Light Fidelity (Li-Fi) Communications Applied to Telepresence Robotics. In Proceedings of the 2020 21st International Carpathian Control Conference (ICCC), High Tatras, Slovakia, 27–29 October 2020; pp. 1–5.
  87. Ţoţa, P.; Vaida, M.F. Solutions for the design and control of telepresence robots that climb obstacles. In Proceedings of the 2020 IEEE International Conference on Automation, Quality and Testing, Robotics (AQTR), Cluj Napoca, Romania, 19–21 May 2020; pp. 1–6.
  88. Soomro, S.R.; Eldes, O.; Urey, H. Towards Mobile 3D Telepresence Using Head-Worn Devices and Dual-Purpose Screens. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Reutlingen, Germany, 18–22 March 2018; pp. 1–2.
  89. Hepperle, D.; Ödell, H.; Wölfel, M. Differences in the Uncanny Valley between Head-Mounted Displays and Monitors. In Proceedings of the 2020 International Conference on Cyberworlds (CW), Caen, France, 29 September–1 October 2020; pp. 41–48.
  90. Mbanisi, K.; Gennert, M.; Li, Z. SocNavAssist: A Haptic Shared Autonomy Framework for Social Navigation Assistance of Mobile Telepresence Robots. In Proceedings of the 2021 IEEE 2nd International Conference on Human-Machine Systems (ICHMS), Magdeburg, Germany, 8–10 September 2021; pp. 1–3.
  91. Zhang, G.; Hansen, J.P.; Minakata, K.; Alapetite, A.; Wang, Z. Eye-Gaze-Controlled Telepresence Robots for People with Motor Disabilities. In Proceedings of the 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Republic of Korea, 11–14 March 2019; pp. 574–575.
  92. Hung, L.; Wong, J.; Smith, C.; Berndt, A.; Gregorio, M.; Horne, N.; Jackson, L.; Mann, J.; Wada, M.; Young, E. Facilitators and barriers to using telepresence robots in aged care settings: A scoping review. J. Rehabil. Assist. Technol. Eng. 2022, 9, 20556683211072385.
  93. Perifanou, M.; Economides, A.A.; Häfner, P.; Wernbacher, T. Mobile Telepresence Robots in Education: Strengths, Opportunities, Weaknesses, and Challenges. In Proceedings of the Educating for a New Future: Making Sense of Technology-Enhanced Learning Adoption, Toulouse, France, 12–16 September 2022; Springer: Cham, Switzerland, 2022; pp. 573–579.
  94. Teng, R.; Ding, Y.; See, K.C. Use of Robots in Critical Care: Systematic Review. J. Med. Internet Res. 2022, 24, e33380.
  95. Chu, H.; Ma, S.; De la Torre, F.; Fidler, S.; Sheikh, Y. Expressive Telepresence via Modular Codec Avatars. In Proceedings of the Computer Vision—ECCV 2020, Glasgow, UK, 23–28 August 2020; Vedaldi, A., Bischof, H., Brox, T., Frahm, J.M., Eds.; Springer: Cham, Switzerland, 2020; pp. 330–345.
  96. Yoon, L.; Yang, D.; Chung, C.; Lee, S.H. A Full Body Avatar-Based Telepresence System for Dissimilar Spaces. arXiv 2021, arXiv:2103.04380.
  97. Cymbalak, D.; Jakab, F.; Szalay, Z.; Turnň, J.; Bilský, E. Extending Telepresence Technology as a Middle Stage between Humans to AI Robots Transition in the Workplace of the Future. In Proceedings of the 2019 17th International Conference on Emerging eLearning Technologies and Applications (ICETA), Virtual, 27–28 September 2019; pp. 133–138.
  98. Said, S.; AlAsfour, G.; Alghannam, F.; Khalaf, S.; Susilo, T.; Prasad, B.; Youssef, K.; Alkork, S.; Beyrouthy, T. Experimental Investigation of an Interactive Animatronic Robotic Head Connected to ChatGPT. In Proceedings of the 2023 5th International Conference on Bio-engineering for Smart Technologies (BioSMART), Paris, France, 7–9 June 2023; pp. 1–4.
  99. Bhattacharyya, A.; Sau, A.; Roychoudhury, R.D.; Banerjee, S.; Sarkar, C.; Pramanick, P.; Ganguly, M.; Bhowmick, B.; Purushothaman, B. Teledrive: An Intelligent Telepresence Solution for “Collaborative Multi-presence” through a Telerobot. In Proceedings of the 2022 14th International Conference on COMmunication Systems & NETworkS (COMSNETS), Bengaluru, India, 3–8 January 2022; pp. 433–435.
  100. Marques, B.; Silva, S.; Alves, J.; Araújo, T.; Dias, P.; Santos, B.S. A Conceptual Model and Taxonomy for Collaborative Augmented Reality. IEEE Trans. Vis. Comput. Graph. 2022, 28, 5113–5133.
  101. Wang, P.; Bai, X.; Billinghurst, M.; Zhang, S.; Zhang, X.; Wang, S.; He, W.; Yan, Y.; Ji, H. AR/MR Remote Collaboration on Physical Tasks: A Review. Robot. Comput-Integr. Manuf. 2021, 72, 102071.
  102. Sereno, M.; Wang, X.; Besançon, L.; McGuffin, M.J.; Isenberg, T. Collaborative Work in Augmented Reality: A Survey. IEEE Trans. Vis. Comput. Graph. 2022, 28, 2530–2549.
  103. De Belen, R.A.J.; Nguyen, H.; Filonik, D.; Del Favero, D.; Bednarz, T. A systematic review of the current state of collaborative mixed reality technologies: 2013–2018. AIMS Electron. Electr. Eng. 2019, 3, 181–223.
  104. Wang, P.; Zhang, S.; Billinghurst, M.; Bai, X.; He, W.; Wang, S.; Sun, M.; Zhang, X. A comprehensive survey of AR/MR-based co-design in manufacturing. Eng. Comput. 2020, 36, 1715–1738.
  105. Kim, S.; Billinghurst, M.; Kim, K. Multimodal interfaces and communication cues for remote collaboration. J. Multimodal User Interfaces 2020, 14, 313–319.
  106. Kim, K.; Schubert, R.; Hochreiter, J.; Bruder, G.; Welch, G. Blowing in the wind: Increasing social presence with a virtual human via environmental airflow interaction in mixed reality. Comput. Graph. 2019, 83, 23–32.
  107. Kristoffersson, A.; Eklundh, K.; Loutfi, A. Measuring the Quality of Interaction in Mobile Robotic Telepresence: A Pilot’s Perspective. Int. J. Soc. Robot. 2013, 5, 89–101.
  108. Chang, E.; Kim, H.T.; Yoo, B. Virtual Reality Sickness: A Review of Causes and Measurements. Int. J. Hum.-Comput. Interact. 2020, 36, 1658–1682.
  109. Allison, R.; Harris, L.; Jenkin, M.; Jasiobedzka, U.; Zacher, J. Tolerance of temporal delay in virtual environments. In Proceedings of the IEEE Virtual Reality 2001, Yokohama, Japan, 13–17 March 2001; pp. 247–254.
  110. Tilikete, C.; Pisella, L.; Pélisson, D.; Vighetto, A. Oscillopsies: Approches physiopathologique et thérapeutique. Rev. Neurol. 2007, 163, 421–439.
  111. Tilikete, C.; Vighetto, A. Oscillopsia: Causes and management. Curr. Opin. Neurol. 2011, 24, 38–43.
  112. Buhler, H.; Misztal, S.; Schild, J. Reducing VR Sickness Through Peripheral Visual Effects. In Proceedings of the 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Reutlingen, Germany, 18–22 March 2018; pp. 517–519.
  113. Suomalainen, M.; Sakcak, B.; Widagdo, A.; Kalliokoski, J.; Mimnaugh, K.J.; Chambers, A.P.; Ojala, T.; LaValle, S.M. Unwinding Rotations Improves User Comfort with Immersive Telepresence Robots. In Proceedings of the 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Online, 7–10 March 2022; pp. 511–520.
Figure 1. A chart showing the number of references cited in Section 2, grouped by field of application of telepresence.
Figure 2. Two Double robots at the American University of the Middle East. The figure shows their screens, which can display the remote user. The remote user acquires visual and auditory information from the platform’s environment and can control the motion of the robot through a dedicated interface.
Figure 3. (Left) Pepper robot; (Right) NAO robot at the American University of the Middle East. These two humanoid robots have been used in several applications, such as social robotics research and telepresence.
Figure 4. Technologies involved in a telepresence system.
Figure 5. Adam animatronic robotic platform at the American University of the Middle East.
Table 1. Sources (journals/conferences) of two or more papers cited in the current review.
Journal/Conference | Cited Papers
International Journal of Social Robotics | 10
Sensors | 4
IEEE Conference on Virtual Reality and 3D User Interfaces | 4
IEEE Transactions on Visualization and Computer Graphics | 3
ACM/IEEE International Conference on Human-Robot Interaction (HRI) | 3
Paladyn, Journal of Behavioral Robotics | 2
Consciousness and Cognition | 2
Frontiers in Robotics and AI | 2
International Journal of Human-Computer Interaction | 2
International Conference on Bio-Engineering for Smart Technologies | 2
Table 2. Examples of systems used in telepresence and some of their characteristics.
System | Remote Side | User Side | Other Features
Virtual-reality-based telepresence system [67] | Mobile platform, computer, microphone array, speakers and RGB-D camera | Head-mounted display | 3D visual data are transmitted to the user; the user’s intended head movements are used to control the motion of the mobile platform
RDW telepresence system [68] | Mobile platform and 360-degree camera | Head-mounted display | The motion of the mobile platform is controlled by the user’s walking
Framework for immersive telepresence [69] | Handheld mobile device and camera or panoramic camera | Head-mounted display | The user and remote user communicate through voice, and the user’s hand gestures can be transmitted to the camera side
Immersive telepresence system [70] | Fixed panoptic camera | Head-mounted display | The user can naturally look around due to the omnidirectionality of the panoptic camera
Akibot [40] | Mobile platform with a screen and devices like an otoscope and a stethoscope | A computer | Designed to be maneuverable and used in medical consultations between doctors and patients
Semiautonomous telepresence [71] | robovie-mR2 robot [72] | A computer | The robot is semiautonomous, automating movements with and without the user’s intention
Beaming system used in [73] | VR system with surround visuals, audio, and tactile, haptic and biosensing systems | Head-mounted display and motion-tracking suit | Recreates a real environment in a virtual model using portable or mobile technical interventions
Beaming system used in [74] | NAO V6 robot with two webcams | Head-mounted display and motion-capture system | The system makes the robot mimic the human user’s movements
Geocaching activity shown in [75] | Beam+ robot with a 360-degree camera | Smartphone in a plastic case worn by the user and an iMac computer | The robot is driven by the user using a PlayStation 3 controller and a desktop application
Bidirectional telepresence in [76] | Beam+ robot with a 360-degree camera | Smartphone in a plastic case worn by the user and an iMac computer | The robot is driven by the user using a PlayStation 3 controller and a desktop application
Telesuit in [77] | A humanoid robot | A suit with sensors and a head-mounted display | The suit is equipped with inertial measurement units and other sensors to capture the operator’s movements and monitor his health
Mobile Robotic Presence system in [34] | Loomo mobile robot | A mobile system | The system allows text-to-speech and emoji communication, with audio and video streaming and navigation for mobility
MDPI and ACS Style

Youssef, K.; Said, S.; Al Kork, S.; Beyrouthy, T. Telepresence in the Recent Literature with a Focus on Robotic Platforms, Applications and Challenges. Robotics 2023, 12, 111. https://doi.org/10.3390/robotics12040111