Abstract
This study investigated how touching and being touched by a humanoid robot affects human physiology, impressions of the interaction, and attitudes towards humanoid robots. Twenty-one healthy adult participants completed a 3 (touch style: touching, being touched, pointing) × 2 (body part: hand vs buttock) within-subjects design using a Pepper robot. Skin conductance response (SCR) was measured during each interaction. Perceived impressions of the interaction (i.e., friendliness, comfort, arousal) were measured via questionnaire after each interaction. Participants’ demographics and their attitudes towards robots were also considered. We found shorter SCR rise times in the being-touched condition than in the touching condition, possibly reflecting psychological alertness to the unpredictability of robot-initiated contact. The hand condition had shorter rise times than the buttock condition. Most participants rated the hand condition as most friendly and comfortable and the robot-initiated interactions as most arousing. Interacting with Pepper improved attitudes towards robots. Our findings should be confirmed in future studies with larger samples and improved procedures. They have implications for robot design in all domains involving tactile interaction, such as care and intimacy.
1 Introduction
1.1 Social touch in Human–Human Interactions
Social touch, as opposed to self-touch, forms an important channel for human non-verbal communication [25, 28]. Compared to verbal and other forms of non-verbal communication (e.g., facial expressions, gestures, posture), touch primarily conveys intimate emotions [1, 26, 59] and thus engenders interpersonal closeness [54]. In our daily life, social touch takes many different forms [44]: shaking hands and kissing during greetings, cuddling and holding hands in intimate interactions, and spanking someone on the bottom as a form of correction. These touches can elicit a wide range of experiences and responses at the physiological and emotional level, each with direct impact on behaviour. A comprehensive overview of the consistent effects of social touch at these different levels has been provided by a number of studies [25, 26, 28, 40]. In the present study, we will first review specific findings on social touch in human–human interactions before looking at related work on human–robot interactions. We will then describe the methods and results of an experiment that recorded both physiological and subjective responses while humans either touched or were touched by a humanoid robot. Results will be discussed with respect to theoretical, methodological and practical implications.
Consider first the physiological level. Receiving social touch has been shown to reduce levels of cortisol, the so-called “stress hormone” [36], and to lower blood pressure and heart rate [21, 32]. Interpersonal touch is thus a means of comforting someone under stress and/or experiencing negative arousal [19]. For example, apart from being frequently employed in nursing care [6, 8, 37], social touch has been used effectively in a variety of stress-prone situations, ranging from before an operation [85] and prior to public speaking [18, 32] to being threatened by physical pain [10] or even watching scary videos [48]. In addition to attenuating physiological stress responses, social touch can also evoke higher skin conductance responses (SCRs), an indicator of physiological arousal that reflects increased electrical conductivity of the skin following sweat production. For example, increased SCRs were observed in individuals briefly touched by a stranger in passing [83] and in socially-anxious participants touched by an experimenter [86]. These physiological effects elicited by social touch can be interpreted as pleasant or unpleasant, depending critically on the context [7, 11, 13].
Consider now the emotional level of touch interactions. The sense of touch can be used to communicate emotions [38, 39] as well as to elicit emotions [75]. Emotional consequences of touching also depend on the gender combination of the dyad [74] and the sexual attitudes of its members (e.g., [76]). Moreover, social touch has been shown to enhance interpersonal relationships via establishment of trust [1, 26] and can influence people’s (pro)social behaviour, referred to as the “Midas Touch” effect [15]. This effect involves a brief casual touch on the arm or shoulder by a stranger which results in an increased willingness to comply with that stranger’s subsequent request [33, 45, 76]. Social touch in human–human interactions can therefore evoke powerful physiological responses, strengthen social bonds and enhance our prosocial behaviours.
It is the brain’s interpretation of the body’s physiological states that determines our conscious experience of distinct emotions [29]. Together with the existence of specific neurophysiological channels for affective touch (the so-called C-tactile afferents in human skin; cf. [81]), somatosensory representations of emotions in the human body [67], and direct physiological reactions to skin stimulation in early animal research [34, 58], the evidence to date indicates a close link between tactile stimulation, physiological responses and emotional experiences.
1.2 Social touch in Human–Robot Interactions
Nowadays, robots are increasingly stepping into our social and even private lives. Due to their physical embodiment, toy and care robots can be touched and handled by us. Humanoid robots in particular (i.e., robots with their body built to resemble the human body) offer active touching of humans as a unique mode of communication. The United Nations Economic Commission for Europe and the International Federation of Robotics [80] have recognized potential benefits of natural physical contact between people and personal service robots. Given that human social touch is grounded in specific neurophysiological processing channels and intricately related to emotions, it is clear that touch interaction ultimately impacts our social relationships with and behaviours towards humanoid robots. However, there is so far only a very limited understanding of whether touch interactions with humanoid robots in a social context can elicit physiological responses in human participants that resemble those in human–human interactions. Evidence gathered from this new line of research can ultimately serve as an important first step towards improved human–robot interactions. A brief review of the state of the art follows.
Although research on human–robot social touch interactions is still relatively new, preliminary findings suggest that touch interaction with artificial entities can elicit physiological, emotional and behavioural responses in humans that are similar to those elicited in human–human interactions [23]. Previous studies of touch interactions with human users were implemented to assess affective and social benefits in healthcare [68, 69, 71, 90], education [78] and entertainment settings [27, 84]. For example, elderly individuals touching a pet robot reported improved empathy [69] and people with dementia touching a pet robot reported improved mood [60] and social connection [90]. Moreover, a humanoid robot’s touch (i.e., being touched by a robot) increased motivation to carry out a monotonous task [62, 72] and decreased perceived machine-likeness of the robot in participants with general positive attitude towards robots [12]. It also reduced perceived unfairness of unfair monetary offers proposed by the robot [27].
Previous research shows that touch styles in human–robot interaction influence the interaction outcomes. Hirano et al. [42] found that, in comparison to passively being touched by a robot, people actively touching a humanoid robot perceived the robot as more friendly and the interaction as more comfortable. Furthermore, the gender of the toucher and the receiver [56], as well as people’s attitude towards robots [9, 12, 77], influence the interpretation of touch interactions.
A relatively small set of studies have investigated human physiological responses to either touching the robot [53, 68, 71] or being touched by the robot [9, 87, 88]. In studies where robots initiated a social touch, Willemse et al. [87, 88] compared human physiological responses between a “robot-touch human” condition (robot actively touched participant’s right shoulder and upper arm while it spoke soothing words) and a “robot-not-touch-human” condition (where the robot only moved its head and arms without making physical contact with the participant while it spoke soothing words).
Most of these studies, however, did not distinguish the physiological effects of physical contact with different body parts. For example, whether touching or being touched by the robot on one’s hands elicits a similar response as contact on other body parts is not well understood. In human non-verbal communication, we not only limit our body parts to be touched by others (“body accessibility”; [46]), but in turn also touch only limited body parts of others (“touch zones”; [79]), depending on the nature of our social relationship. The human hand is one of the most frequently touched body parts (a “high accessibility region”; [46]) and one of the regions we most publicly touch on others (the “public touch zone”; [79]), whereas other regions such as the buttock and genital areas fall into the “low body accessibility” or “private touch zone” category [46, 79]. Nevertheless, it is important to investigate how humans respond to touching and being touched in such areas because of the increasingly intimate relationships between humans and humanoids in care and companionship settings. It is plausible to anticipate different physiological responses when physical contacts with these different body regions are made in human–human interactions.
In human–robot social interactions, there are many contexts where physical contacts with different body regions of both humans and robots are unavoidable or even necessary. Examples include using care robots for patient transfer from floor to chair in healthcare [61], early childhood education in the form of playing with children [78] and the emerging and controversial field of intimate relationships with humanoid robots [52, 91]. It becomes increasingly important to collect empirical evidence on this topic in order to help inform the design of social robots in domains where physically contacting different body parts forms an essential part of the human–robot interaction and fundamentally shapes their relationship. This is the reason why, in the present study, we included also a touch condition involving an intimate body part, namely the buttocks. We controlled the robot’s perceived gender to reduce further complications that might arise from the gender combination of the interaction dyads.
Li et al. [53] provided some initial evidence on the physiological responses to touching different body parts of a small humanoid robot Nao (approximately 57 cm tall). The authors categorised 13 body parts of the Nao robot into three groups based on the social construct of “body accessibility” [46]: high accessibility (hand, arm, forehead and neck), medium accessibility (back, ear, eye, foot) and low accessibility (inner thigh, heart, breast, buttocks and genital area). In an anatomy learning scenario, 31 participants were asked to touch these body parts of the robot while their SCRs were measured, with pointing as a control condition. Participants generated significantly increased SCRs (indicating arousal; effect size estimate η2 = 0.10) while touching low-accessibility body parts, compared to touching high-accessibility body parts, but not when merely pointing to the robot. The authors interpreted their results as showing that people treat touching body parts as an act of closeness in itself that does not require a human recipient. Alternatively, the results might also signal that people attribute private bodily zones to the humanoid robot, and such attributions constitute an important aspect of understanding the nature of human–robot interaction (cf. InStance project; [57]). What remains unknown is whether these physiological effects occur when touching a non-toy-sized humanoid robot whose body height more closely resembles a human’s. More importantly, it is not known whether similar effects obtain when a robot initiates the touch (i.e., people are touched by a robot rather than actively touching the robot).
1.3 The Current Study
The current study aims to investigate how human–robot touch interaction (both human-initiated and robot-initiated touches) affects human physiology and perceived impressions of the interaction, and how it changes attitudes towards humanoid robots. We took inspiration from previous work on human–robot interaction using the small humanoid robot Nao [53]. However, several methodological improvements were made to Li and colleagues’ pioneering study [53]: (a) using a non-toy-sized humanoid robot, Pepper (120 cm tall, compared to only 57 cm for the Nao robot), (b) measuring participants’ individual differences, such as their sexual orientation and attitude toward robots, both before and after the touch interaction, (c) adding a robot-initiated touch style to the interaction and (d) measuring perceived impressions of the interaction, as well as physiological responses, as an indicator of arousal. Knowing the subjective impression of the touch interactions helps us to interpret the objective physiological response of the skin conductance change. We expected the buttock-related touch conditions to elicit higher skin conductance than the hand-related touch conditions. This would generalize the results of Li and Reeves [53] to another robot, to the experience of being touched, and to another intimate body part.
2 Materials and Methods
2.1 Participants
Twenty-one right-handed participants (13 females, mean age = 24.7 years, SD = 4.1) took part in the study. Participants were recruited primarily through a combination of email-based and personal recruitment at two universities. The Ethics Committees of both the University of Applied Sciences Wildau and the University of Potsdam approved the study. Table 1 presents characteristics of the sample. All participants reported being fluent in English.
2.2 Study Design
The robot used for this study, Softbank Robotics’ (formerly Aldebaran Robotics) Pepper, is a programmable semi-humanoid robot with a height of 120 cm and a weight of 28 kg. Pepper has 20 degrees of freedom of movement and has a head, neck, LED equipped eyes, arms, sensor-equipped hands with five fingers each and visible hips. Its torso ends with a mobile base but it does not have any legs or genitals. There is also a tablet (24.6 × 17.5 cm) with a 10.1″ display attached to the chest of Pepper to facilitate communication with its users. This tablet was turned off during the experiment in order to force participants to focus on verbal communication, which took place in English, throughout the study. Pepper’s eyes were lit to a bright white colour throughout the experiment to help animate the interaction. The right back side of the hip area was pointed out by the experimenter to each participant as the “buttock area” of the robot before the experiment in order to control possible vocabulary differences between participants. Movements for touch interaction and the voice settings (as a male speaker) of the robot were jointly designed and programmed by the first author and the research team from the University of Applied Sciences Wildau. We purposefully identified the robot as male in its introduction monologue (see Procedure below) and checked this manipulation in the post-experiment questionnaire.
The three touch styles included in the study were:
-
actively touching the robot: after the robot’s request to touch either its hand or its buttock, the participant initiates physical contact with the named body part of the robot using his/her right hand;
-
passively being touched by the robot: after the robot seeks permission to touch the participant’s right hand, the robot initiates physical contact using its left hand. For touching the participant’s buttock area, the robot only seeks permission without subsequently touching the participant’s buttock area (this was due to inter-subject physical variability and related problems with programming the robot’s hand trajectory). Thus, this touch condition is clearly referred to as “being touched/asked permission to touch (PTT)” in the subsequent text; and
-
pointing to the robot: after the robot’s request to point to either its hand or its buttock, the participant points to the corresponding body parts of the robot without direct physical contact, using his/her right index finger.
A mutual touch condition, where the person touches the robot and the robot actively returns the touch (such as during a handshake, cf. [2]), was not included in the present exploratory study. To be clear: in this study the human-initiated conditions include participants touching both the robot’s hand and the robot’s buttock whereas the robot-initiated conditions include the participants being touched by the robot on the hand and being asked permission by the robot to touch their buttock area (but without this latter touch event actually occurring).
Participant gender, age, sexual orientation, experience with robots, and attitude towards robots were collected, as described below. Whether participants took medications that were likely to influence skin conductance levels was also ascertained [73]. The main dependent variable was SCR, measured using Shimmer hardware and iMotions software. We also measured perceived friendliness of the robot, comfort and arousal in each condition via questionnaire.
We used a 3 (touch style: actively touching vs passively being touched/asked PTT vs pointing) × 2 (body part: hand vs buttock) within-subjects design. Pointing served as a control condition without physical contact, as in [53]. Given resource constraints and the exploratory nature of this study, the within-subjects design increased the chance of detecting differences among conditions while requiring fewer participants. To mitigate possible order effects inherent in within-subjects designs, the order of conditions was counterbalanced, with half of the participants starting with the hand condition and the other half starting with the buttock condition.
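For illustration, the counterbalancing of the starting body part across participants could be sketched as follows (a minimal Python sketch; the function, the condition labels and the fixed within-participant style order are our assumptions for illustration, not part of the study materials):

```python
import random

def assign_orders(participant_ids, seed=42):
    """Counterbalance the starting body part across participants.

    Half of the participants start with the hand condition, the other
    half with the buttock condition; within each participant, the three
    touch styles are presented for one body part before the other.
    Illustrative sketch only, not the authors' scheduling code.
    """
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)  # randomize who falls into which counterbalancing group
    half = len(ids) // 2
    orders = {}
    for i, pid in enumerate(ids):
        first, second = ("hand", "buttock") if i < half else ("buttock", "hand")
        styles = ["touching", "being_touched", "pointing"]
        # six conditions per participant: 3 styles x 2 body parts
        orders[pid] = [(style, part) for part in (first, second) for style in styles]
    return orders
```

With an even number of participants this yields exactly balanced groups; with 21 participants, one group would contain one more participant than the other.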
2.3 Materials
The Negative Attitudes Towards Robots Scale (NARS; [66]) is widely used in human–robot interaction research [63,64,65, 77] to measure attitudes towards robots. The questionnaire was administered twice, once before and once after the experiment, in order to find out whether the interaction had any effect on participants’ attitudes (see Appendices 3 and 4). The only differences between the two questionnaire versions were that (a) only the pre-questionnaire included demographics (age, gender, discipline and sexual orientation) and factors that might affect SCRs (e.g., whether medications for cold or depression had been taken in the last seven days) and (b) only the post-questionnaire included measures of perceived impressions of the robot (i.e., gender and age of the robot) and of the interaction (i.e., friendliness, comfort and arousal of the interaction). The original English version of the NARS was used, with 14 items across three subscales: negative attitudes toward situations of interaction with robots (e.g., “I would feel paranoid talking with a robot”), negative attitudes toward the social influence of robots (e.g., “Something bad might happen if robots developed into living beings”) and negative attitudes towards emotions in interaction with robots (e.g., “If robots had emotions, I would be able to make friends with them”). The scale has an internal consistency of 0.79, reflecting good reliability. “Appendix 1” lists all variables collected in the study.
2.4 Procedure
The study took place in the library of University of Applied Sciences, Wildau where the Pepper robot is used as a library assistant. The entire experiment, including all robotic instructions, was conducted in English. The whole data collection procedure consisted of three steps: Step 1—experimental setup, consent-taking and pre-experiment questionnaire completion, Step 2—the touch interaction experiment and Step 3—the post-experiment questionnaire administration.
During Step 1, the humanoid robot Pepper was set up in a quiet room in the library, facing the participant. A small tailor-made table was placed between the robot and the chair where the participant was to sit. A human-right-hand outline with palm facing down was marked on the table to ensure each participant could place his/her hand at the same location to be reliably touched by the pre-programmed robot. The chair was adjustable to accommodate different arm lengths of participants for the touching-the-robot condition. After setting up the experiment, informed written consent was obtained, followed by completion of a pre-experiment questionnaire that collected participants’ demographic information (age, gender, discipline, sexual orientation) and their attitude toward robots. Two dry electrodes with a diameter of 8 mm (Shimmer) were then placed on two fingers (index and middle) of the participant’s left hand (i.e., their non-dominant hand). The electrodes were connected via 9-inch leads to the sensor (Shimmer3 GSR+ Unit), which was attached to the participant’s left wrist with a wrist strap and provided real-time data collection and wireless data transmission to the computer running the iMotions software. The sampling frequency was 128 Hz. Figure 1 (b, right panel) shows the electrode placement. Consent, questionnaire and electrode placement were completed in a separate room adjacent to the experimental room within a period of approximately eight minutes.
At Step 2, the human–robot touch interaction experiment took place in a quiet room from which the participant could not see outdoor activities. Two researchers monitored the experiment in an adjacent room through a tablet camera placed in the experimental room. The participant was first guided by the experimenter to the experimental room to be seated comfortably facing the robot. The participant was then instructed about the robot’s body parts including the hand and buttock areas. The participant was told to place his/her left hand on the lap when his/her right hand engaged in the touch interaction. The participant was then asked to sit quietly with eyes closed for five minutes for the baseline skin conductance data collection. Please note that all participants completed the touch interaction conditions with their eyes open. At the end of the 5-min baseline period, the robot verbally introduced the experiment (for the monologue text and voice settings used, see “Appendix 2”).
Each condition started with the robot saying “Please touch my hand/touch my buttock/place your hand at the marked area on the table/point at my hand/point at my buttock” and each condition ended with the robot saying “Thank you, please put back your hand”. The durations of touch events and touch intensities varied across conditions and depended on individuals’ compliance and hand anatomy. The only exception was the condition of being touched by the robot on the buttock, where the robot merely asked permission to touch the participant’s buttock (i.e., “Is it OK if I touch your buttock area?”), while the act of the robot touching the participant’s buttock area did not actually take place. This was purposefully designed for two reasons. First, there were significant technological challenges in programming the robot to reliably and safely touch participants of different heights and body shapes. Second, the robot’s asking permission to touch the participant’s buttock area, in other words, human anticipation of touch stimulation, especially on a socially-less accessible body part, is an important psychological process that still needs investigation. For example, recent evidence indicates that anticipation of touch modifies human brain function [55].
The inter-condition interval (i.e., “cool-off” period) was approximately 13 s, which allowed the skin conductance level to return to baseline before the next experimental condition. Figure 1 (a, left panel) shows the condition of being touched by robot on hand. The end of the experiment was marked by the robot saying “Thank you for your participation. You can now leave the room.” The duration of each condition was marked manually on the video stream by the first author during the experiment with one marker for the beginning and one for the end, corresponding to the robot’s statements of “Please” and “Thank you”, respectively. The experiment lasted for approximately nine minutes.
At Step 3, participants completed a post-experiment questionnaire (see Appendix 4) which was the same as the one used in Step 1 but with additional questions about how they perceived the robot gender and age and which condition they rated to be the most/least friendly/comfortable/arousing. Participants were only allowed to choose one condition for each attribute. An informal interview regarding how participants felt about the experiments (e.g., whether they understood the robot’s instructions and questions and whether they answered “yes” in response to the robot seeking permission to touch the buttock area) was also conducted and the answers noted down. Step 3 was completed within approximately six minutes. The entire data collection required approximately 25 min (Table 2).
2.5 Data Preparation and Analysis
Skin conductance, one form of electrodermal activity (a term introduced by Johnson and Lubin [43]), is one of the most commonly used measures of emotional arousal in psychological research [4, 49]. The raw skin conductance signal consists of two components: the skin conductance level (known as the tonic level) and the SCR (known as the phasic response; [17]). The tonic level describes the overall conductivity of the skin over longer time intervals ranging from tens of seconds to minutes, which is not informative with regard to responses to stimuli. The phasic response rides on top of these tonic changes and shows significantly faster alterations, being sensitive to emotionally arousing stimulations that are of interest in the present context. Raw data was therefore prepared in the following three steps before final analyses took place: (1) assigning and verifying experimental conditions by using the live markers placed on the video stream, (2) extracting the phasic response data so that the tonic level was removed from the raw skin conductance signal and (3) applying the moving average method to filter motion-related artefacts from the phasic data.
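The tonic/phasic separation in step (2) and the moving-average filtering in step (3) can be sketched as follows (a simplified Python illustration; the window lengths are our assumptions for illustration, not the settings used in the iMotions software):

```python
import numpy as np

def extract_phasic(raw, fs=128, tonic_window_s=4.0, smooth_window_s=0.25):
    """Separate a raw skin conductance signal (in µS) into tonic and
    phasic components using moving averages.

    A long moving average estimates the slowly drifting tonic level;
    subtracting it from the raw signal leaves the phasic response,
    which is then lightly smoothed to suppress motion artefacts.
    Window lengths are illustrative assumptions.
    """
    def moving_average(x, n):
        n = max(1, int(n))
        kernel = np.ones(n) / n
        # mode="same" keeps the output aligned with the input samples
        return np.convolve(x, kernel, mode="same")

    raw = np.asarray(raw, dtype=float)
    tonic = moving_average(raw, tonic_window_s * fs)   # slow drift
    phasic = raw - tonic                               # fast, stimulus-related part
    phasic = moving_average(phasic, smooth_window_s * fs)  # artefact smoothing
    return tonic, phasic
```

For a constant input the phasic component is (away from the window edges) zero, as expected, since all conductance is then attributed to the tonic level.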
Once the experimental conditions had been identified, three aspects of the phasic responses among conditions were studied: (1) visualization of patterns of phasic responses for each condition, (2) average phasic response differences measured in microsiemens (µS) and (3) peak differences regarding rise time, amplitude and recovery time. Two-way repeated measures analyses of variance (ANOVAs) were performed on aspects (2) and (3) to investigate the main effects of touch style and body part as well as the touch style × body part interaction. Regarding questionnaire data, pre-post responses were compared using a repeated measures t-test (after ensuring that parametric test assumptions were met) to investigate possible effects of the touch interaction on attitudes towards robots. Regarding perceptions of the interaction outcomes (i.e., the identification of the most/least friendly/comfortable/arousing conditions), rating outcomes were counted and collated.
Analyses were conducted using IBM SPSS Statistics and RStudio (for pattern visualization only). Results from both physiological and questionnaire data were finally compared in light of the sample and design limitations.
3 Results
3.1 Visualization of Patterns of Phasic Responses
Figure 2 visualizes the patterns of phasic responses of one participant over the course of an experiment. We can see from Fig. 2 that this participant’s order of experimental conditions was, after the initial Introduction session, Pointing–Touching–Being touched/asked PTT with buttock area first followed by hand area. Considering their duration, the longest condition was the Introduction (approximately 37 s), followed in duration by the last condition R touches hand (approximately 26 s), while the shortest conditions were Pointing and Touching R buttock (all under 10 s duration). Visual inspection also reveals that the variability in SCRs in the condition R touches hand seems larger than the variability in all other conditions. Visualization of patterns of phasic responses was conducted for each of the 21 participants, which served as an important verification process for the quality of the data collected.
SCRs for each condition were visualized using RStudio (Fig. 3), resulting in 21 lines representing the 21 participants. Visual inspection of the patterns indicated overall larger SCRs in the Touch robot hand condition and highest variability in the Robot touches hand condition. This pattern warranted more detailed statistical investigation in the next stage.
3.2 Differences in Average Phasic Responses
Visual inspection of mean differences in phasic SCRs among conditions showed that the Touch robot hand condition had the largest average response (0.0336 µS; Fig. 4). A two-way repeated measures ANOVA explored the main effects of touch style and body part and their interaction on the average phasic SCRs. No significant effects were found for either the main factors or the interaction (for details see “Appendix 5”).
3.3 Peak Analysis (Rise Time, Amplitude and Recovery Time)
The most frequently used measure to describe a single electrodermal response (e.g., a single SCR) is its amplitude [4]. For the electrodermal recording method used in our study (exosomatic recording with direct current), a stimulus will, after a response latency, result in an increase in skin conductance. The amplitude of this signal refers to the intensity of a single response (i.e., the amplitude difference between response onset and peak). Two measures frequently used to describe the shape of electrodermal responses are rise time (i.e., the time from response onset to maximum) and recovery time (i.e., the time needed to recover a certain proportion of the amplitude). Rise time is typically shorter than recovery time [4].
The peaks in skin conductance used for our analyses are event-related phasic SCRs (ER-SCRs), i.e., specific responses to the experimental stimuli, which occurred 1–4 s after condition onset. ER-SCRs differ from spontaneous or non-specific responses (NS-SCRs), which are not consequences of any eliciting stimulus but instead occur in the body at a rate of 1–3 per minute [17]. The peak amplitude typically ranges between 0.1 and 2.0 µS [16]. In our study, the peak onset threshold (> 0.01 µS) and the peak amplitude threshold (0.05 µS above the onset value) were pre-determined in agreement with the standard settings of the iMotions software.
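A simplified sketch of how ER-SCR peaks and their rise times could be extracted from the phasic signal using these thresholds is given below (plain Python; this illustrates the reported threshold logic only and is not the actual iMotions peak-detection algorithm):

```python
def find_er_scrs(phasic, fs=128, onset_thresh=0.01, amp_thresh=0.05):
    """Detect SCR peaks in a phasic signal (in µS).

    A response onset is the first sample rising above onset_thresh; the
    peak is the subsequent local maximum; the response counts as an
    SCR only if its amplitude (peak minus onset value) reaches
    amp_thresh. Rise time is peak time minus onset time. Simplified
    sketch of the reported thresholds, not the software's algorithm.
    """
    peaks = []
    i, n = 0, len(phasic)
    while i < n:
        if phasic[i] > onset_thresh:
            onset = i
            # climb to the local maximum following the onset
            j = onset
            while j + 1 < n and phasic[j + 1] >= phasic[j]:
                j += 1
            amplitude = phasic[j] - phasic[onset]
            if amplitude >= amp_thresh:
                rise_time_ms = (j - onset) / fs * 1000.0
                peaks.append({"onset": onset, "peak": j,
                              "amplitude": amplitude,
                              "rise_time_ms": rise_time_ms})
            # skip past the remainder of this response before searching again
            i = j + 1
            while i < n and phasic[i] > onset_thresh:
                i += 1
        else:
            i += 1
    return peaks
```

Applied to a synthetic triangular response sampled at 128 Hz, the function returns one peak with its onset index, peak index, amplitude and rise time in milliseconds.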
3.3.1 Peak Rise Time
Regarding peak rise time, a two-way repeated measures ANOVA showed significant main effects of both touch style and body part. Touch style: F(2, 40) = 3.217, p ≤ 0.05, partial η2 = 0.139, power = 0.582, with the average rise time in the being touched/asked PTT condition (Mean = 1315.814 ms, SE = 114.046) significantly shorter (p ≤ 0.05) than in the touching condition (Mean = 1895.529 ms, SE = 207.229). Body part: F(1, 20) = 5.384, p ≤ 0.05, partial η2 = 0.212, power = 0.598, with the hand condition (Mean = 1444.750 ms, SE = 118.090) showing a shorter rise time than the buttock condition (Mean = 1776.070 ms, SE = 131.990). No significant interaction was found (for details see “Appendix 5”).
3.3.2 Peak Amplitude
Regarding average peak amplitude, the analysis showed a significant main effect of touch style: F(2, 40) = 6.319, p ≤ 0.01, partial η² = 0.240, power = 0.875; the actively touching condition had a larger amplitude (Mean = 0.214 µS, SE = 0.033) than the passively being touched/asked PTT condition (Mean = 0.142 µS, SE = 0.021), p ≤ 0.01. No significant effects were found for body part or the interaction (for details see “Appendix 5”).
3.3.3 Peak Recovery Time
Regarding average peak recovery time, the grand mean was 1863.34 ms (SD = 792.14 ms). No significant effects were found for the main factors or the interaction (for details see “Appendix 5”).
3.4 Changes in Attitude Towards Robots After Interaction
Changes in attitude towards robots after the tactile interaction compared with before the interaction are shown in Table 3. In general, there was no significant difference in overall attitude toward robots after the approximately nine-minute touch interaction with the Pepper robot (t(20) = -0.987, p > 0.05). However, significantly larger scores (indicating less agreement with statements expressing negative attitudes) were obtained for Item 2 (“Something bad might happen if robots develop into living beings”) after the touch interaction (t(20) = -2.905, p < 0.01). As Item 2 belongs to the subscale measuring the perceived social influence of robots, this finding indicates that participants may become less negative toward social influences of robots after a tactile interaction with them. Furthermore, Item 8 (“I would feel nervous operating a robot in front of other people”), which belongs to the subscale of situations of interaction with robots, received larger scores after than before the tactile interaction, approaching significance (t(20) = -2.007, p = 0.058). This result likewise suggests that negative attitudes toward situations of interaction with robots might be reduced after the interaction.
3.5 Perception of Robot Characteristics and the Interaction Outcome
Regarding the robot’s age, 57.1% of participants (n = 12) perceived the Pepper robot as a child, 33.3% (n = 7) as an adolescent, and only 9.5% (n = 2) as an adult. Regarding the robot’s gender, despite Pepper being introduced as a male robot, only 61.9% of participants (n = 13) believed the robot was male, 33.3% (n = 7) were unsure, and 4.8% (n = 1) thought the robot was female. 52.4% (n = 11) based their judgement of robot gender on the robot’s voice, 19.0% (n = 4) on its appearance, and the remaining 28.6% on a combination of appearance, voice and the robot’s introduction.
Regarding perceived impressions of the interaction outcomes, being touched by the robot on the hand was rated by most participants (71.4%, n = 15) as the friendliest condition, and pointing to the robot’s buttock as the least friendly (38.1%, n = 8). The largest number of participants (38.1%, n = 8) rated touching the robot’s hand as the most comfortable condition, whereas being asked about the buttock was rated least comfortable (38.1%, n = 8). The most arousing conditions were the robot-initiated actions, i.e., being touched by the robot on the hand and being asked by the robot whether it could touch the buttock (both 23.8%, n = 5); the least arousing condition was pointing to the robot’s hand (28.6%, n = 6) (Table 4).
Individually conducted informal interviews after the experiment revealed mixed reactions to the whole experience of touch interactions with a humanoid robot. Some participants expressed positive reactions (e.g., “I was positively surprised”; “I was curious what the robot was going to do next”; “I felt I was with somebody in the room”) whereas others felt the presentation of the robot lacked spontaneity (e.g., “The robot was too rigid”; “The robot acted kind of weird”; “It would be nice if the robot could move the mouth or the eyelids while speaking”). A few participants (n = 4) reported some difficulty in understanding what the robot was saying, either because of the mechanical voice or because they could not clearly make out the word “buttock”. Concerning the robot seeking permission to touch their buttock, 81% (n = 17) said “yes” and 19% (n = 4) were unsure how to answer. For those who were unsure, the common explanation was “It was strange. I was surprised that the robot could ask such a question.” Two participants asked the robot to repeat the question (“Can you please repeat?”), in which case the robot’s pre-programmed answer (i.e., “Thank you for letting me know”) was clearly out of place. A majority of participants (76%, n = 16) reported feeling a greater sense of arousal when the robot initiated the actions compared with the conditions in which they touched the robot.
4 Discussion
4.1 Summary of Results
This study reported preliminary findings of an exploratory experiment detailing how bidirectional social touch (both touching and being touched by a humanoid robot) affected human physiology and subjective impressions of that interaction. The main findings were as follows:
1. Regarding average SCRs, pointing to and touching a humanoid robot, as well as being touched/asked permission to touch by this robot, led to similar physiological activation patterns, with no interaction effect of touch style and body part.

2. Peak analyses of skin conductance revealed that:

   (a) the peak rise time was shorter in the being touched/asked permission to touch by robot condition than in the touching robot condition, and shorter in the hand condition than in the buttock condition;

   (b) no reliable differences between conditions were found in the peak recovery times;

   (c) the peak amplitude was larger in the touching robot condition than in the being touched/asked permission to touch by robot condition.

3. Although participants’ overall attitude toward robots did not change after the touch interaction, participants became generally less negative toward certain aspects of robots (Item 2, contributing to the subscale of social influence of robots; Item 8, contributing to the subscale of situations of interaction with robots) after interacting for approximately 10 min with the robot.

4. Regarding subjective evaluation of the robot’s characteristics, a majority of participants (90%, n = 19) perceived the robot as a child or an adolescent, with only 10% (n = 2) considering it an adult. Despite our explicitly gendered introduction, only 62% (n = 13) thought the robot was male, 33% (n = 7) were unsure and 5% (n = 1) thought Pepper was female.

5. Regarding the perceived impression of the interaction outcome, the hand condition was rated by the largest number of participants as most friendly (n = 15) and most comfortable (n = 8), and the robot-initiated conditions (where the robot either touched the human hand or asked permission to touch the human buttock) were rated by the largest number of participants (n = 5 per condition) as most arousing.

6. 81% of participants (n = 17) said “yes” to the robot seeking permission to touch their buttock and 19% (n = 4) were unsure how to answer the question.
In comparison with the previous report by Li et al. [53], which inspired the present experiment, major methodological differences in our study included adding a robot-touching-human condition and subjective measures of interaction outcomes, as well as using a non-toy-sized humanoid robot (i.e., Pepper as opposed to the much smaller Nao). Considering these methodological differences, it is perhaps not surprising that the results observed in this study differ noticeably from those reported by Li et al. [53]. We now consider these results in turn.
First, Li et al. [53] reported that participants showed a higher SCR when touching low-accessibility body parts of the Nao robot (e.g., buttock) compared with high-accessibility body parts (e.g., hand), a difference that was not observed when they pointed to the robot. This increased SCR suggested higher physiological arousal, which the authors interpreted as psychological alertness resulting from touching low-accessibility body parts of a robot, even though the robot was only a machine. The authors did not consider the alternative possibility that humans attribute intentionality and emotions to a humanoid robot, especially when interacting with it on a personal (tactile) level. In the current study, the effect of body part on average SCR was not significant (in fact, no reliable effect was found for the other main factor of touch style or for the body part × touch style interaction). It is worth noting that the effect size in Li et al.’s study [53] was larger (η² = 0.10) than in the current study (partial η² between 0.04 and 0.07), and the observed powers (between 0.190 and 0.307) were likely insufficient to detect such small effects. It is therefore possible that the current sample size was too small to detect small-sized effects. According to the statistical software G*Power [24], 42 participants would be needed to detect a moderate main or interaction effect (η² = 0.06) using a repeated-measures ANOVA with a significance level of α = 0.05 (two-tailed) and a power of 0.80. As this estimate exceeds our available sample size (n = 21), the results must be interpreted in light of these power limitations and the exploratory nature of the study. Apart from refining the experimental procedures, future studies should use a larger sample size to investigate further the average SCRs across conditions.
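The sample-size reasoning can be sketched numerically. The snippet below uses one textbook approximation based on the noncentral F distribution with noncentrality λ = f² · n · k; G*Power’s repeated-measures options additionally model the correlation among repeated measures and will generally give different numbers, so this is illustrative only:

```python
from scipy.stats import f as f_dist, ncf

def rm_anova_power(partial_eta2, n, k, alpha=0.05):
    """Approximate power of a one-way within-subject effect with k levels
    and n subjects, using the noncentral F distribution and the simple
    noncentrality lambda = f^2 * n * k, where f^2 = eta2 / (1 - eta2).
    This is one common approximation, not G*Power's exact model."""
    f2 = partial_eta2 / (1.0 - partial_eta2)
    df1, df2 = k - 1, (k - 1) * (n - 1)
    lam = f2 * n * k
    f_crit = f_dist.ppf(1.0 - alpha, df1, df2)  # critical F under H0
    return float(1.0 - ncf.cdf(f_crit, df1, df2, lam))

power_n21 = rm_anova_power(0.06, 21, 3)  # the study's sample size
power_n42 = rm_anova_power(0.06, 42, 3)  # the G*Power-recommended sample
```

As expected, power increases with sample size for a fixed effect size; the absolute values depend on the assumed correlation structure, which is why the G*Power estimate should be treated as the authoritative one here.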
Furthermore, the fact that some participants experienced difficulties in comprehending the word “buttock” in a robotic voice, as indicated in the post-experiment informal interview session, may also have worked against inducing reliable effects. Improved language settings or a conversational familiarisation phase are recommended for future studies. Another problem is that the humanoid robot Pepper does not have a distinctively visible buttock per se, which could have impacted on the outcome.
Surprisingly, further peak analyses revealed a shorter peak rise time in the hand condition than in the buttock condition, suggesting that participants reacted faster in the hand condition. It should be noted that the hand condition included touching and being touched by the robot on the hand, while the buttock condition included touching the robot’s buttock and merely being asked by the robot whether it could touch the human buttock. This merely potential (not actual) touch event in the buttock condition may have reduced participants’ arousal level. Moreover, the hand condition was considered by a majority of participants as the most comfortable and friendly condition, perhaps inducing more rapid anticipation of the tactile event. Furthermore, no reliable difference was found in peak amplitudes between the hand and buttock conditions. Although the relationship between peak rise time and peak amplitude is complex, our initial findings on peak rise time with high- vs low-accessibility body parts provide useful information.
Second, adding the robot-touching-human conditions was intended as an improvement over Li and Reeves’ work [53], so that effects of active vs passive touch styles could be compared. Although the effect of touch style on average SCRs was not significant, passively being touched/asked PTT by the robot yielded a shorter peak rise time than actively touching the robot, whereas touching the robot yielded a larger peak amplitude than being touched/asked PTT by the robot. In the being touched/asked PTT by robot condition, the robot asked permission to carry out an action on the participant, whereas in the touching robot condition the robot asked the participant to perform an action on the robot. Thus, one possible explanation of the results might be increased psychological alertness due to the unpredictability of how the robot would carry out an action on the human body in the being touched/asked PTT by robot condition. It is evident in the stress literature that uncontrollability and unpredictability contribute to increased anxiety as measured by skin conductance and self-reported anxiety scores [35]. In our study, the touch force, trajectory, and speed were all relatively unknown in the robot-initiated conditions (i.e., being touched/asked PTT by robot), compared with the touching robot conditions, where people had control over all aspects of the touch action. Furthermore, the qualitative data from the post-experiment informal interviews support this line of argument, in that 76% of participants experienced a sense of arousal in the robot-initiated conditions (i.e., being touched by the robot on the hand and being asked by the robot for permission to touch the buttock area).
So what is the main contribution of this work? As far as we are aware, no studies have so far compared human physiological responses (e.g., skin conductance) to different touch styles between “human-touch-robot” and “robot-touch-human” conditions. Li et al. [53] reported a higher SCR in their “human-touch-robot” vs their “human-point-to-robot” conditions. Willemse et al. [88] found no differences in SCRs between “robot-touch-human” and “robot-not-touch-human” conditions. Both studies compared mean SCRs between conditions without further peak analyses. These differences in methodology and statistical procedures make it difficult to compare the current findings on physiological effects of touch styles with the available evidence in the literature. The findings from the current study can nevertheless be considered a valuable contribution to the field. For example, we are the first to study the effect of a robot seeking permission to touch a human on either a public or an intimate body part (as might become relevant in caring and intimate settings). Future studies are needed to further investigate how different touch styles in human–robot interaction impact on human physiology.
4.2 Theoretical Impact
In human–robot social interactions, social robots typically contribute not only their physical interactions with the user [51], but also their physical embodiment [47] as well as other human-like characteristics [5], such as verbal and non-verbal communication cues, to enhance their social presence. In order for embodied artificial agents to be truly perceived as socially present by their human partners, the agents have to be experienced by the human user as “actual social actors in either sensory or non-sensory ways” ([50], p. 45). Therefore, human–robot tactile interactions should be considered part of the repertoire of social exchanges between humans and the robot, where the meaning of the social touch becomes highly dependent on the accompanying verbal and non-verbal social signals [23]. In our study, the interplay among the various social signals displayed by the Pepper robot was somewhat mis-aligned. On the one hand, its anthropomorphic appearance, its ability to speak and its eye animation (the eyes were lit in white) during the conversation all likely raised people’s expectations of the social presence of the robot. On the other hand, the actual perception of the robot included its mechanical voice and the metallic feel of its touch, coupled with a lack of appropriate intonation or facial expressions and mis-timed answers to participants’ questions. None of these signals was in line with participants’ high expectations. Due to the lack of coherent interplay between these social cues, the touch events we studied may have lost their social meaning, hence failing to produce the anticipated responses. This concern (and the other considerations raised above) leads us to methodological suggestions for future work, discussed next.
4.3 Methodological Considerations and Future Work
Research on human–robot social touch is an emerging area of study [23, 53]. In particular, empirical studies on physiological responses to human–robot social touch involving socially low-accessibility body parts are currently sparse. At the same time, considering the growing number of applications of social robots in our professional, social and private lives, there is an increasing need to identify the multidimensional factors that could play a role in this interaction [88]. These include factors relating to the touch event (e.g., its duration, frequency, and body location), the robot (its appearance, voice, perceived gender and other social cues), the human user (e.g., individual differences and attitudes towards robots), and the context (e.g., the role of the robot and the nature of the relationship). At the moment, no standard research protocols are available, so the present study protocol can be considered a valuable contribution to enhancing our understanding of human–robot social touch in general. Moreover, the study setup allows for further investigations, either with the same robot and variations in voice, touch behaviour and other social cues such as gaze behaviour, or with different types of robots (cf. [14, 89]). These contributions can be considered further merits of our work. However, manually demarcating experimental conditions on video streams potentially increased the noise in our data; this can be improved in future studies by connecting the robot’s touch behaviours directly to the recording software. Finally, although the robot’s permission-seeking question “Is it OK if I touch your buttock?” was intended as an important probe of the human’s psychological anticipation following a socially unusual question, no actual robot-touch-human-buttock event took place in the interaction, even when participants responded with a “yes”.
This setup could have confounded our evaluation of the body part factor. A useful next step for such investigations is to program robot-touch-human-buttock behaviour that is sufficiently natural and safe to be included in the interaction.
The study results should be interpreted in light of the following limitations. First, a convenience sampling strategy was used, which limits the generalizability of the results to other populations. Data on individual differences were collected, including attitude towards robots, a factor previously found to influence people’s perception of interaction outcomes [9, 12]. However, the sample size was not large enough for meaningful statistical comparisons between subgroups such as gender, sexual orientation and people with different attitudes toward robots. This must be left for future studies. Having said this, given the exploratory nature of the study, the current sampling strategy and sample size are considered useful for methodological improvements in a next investigation with a more diverse population and a larger sample.
Second, the humanoid robot Pepper (rather than the small humanoid robot Nao) was purposefully selected because it better resembles human height; the intention was to use its physical presence to enhance its social presence while interacting with human participants. This intended effect is unlikely to emerge when people interact with non-humanoid robots, such as a lifting arm in a care context. However, there seemed to be a discrepancy between the expected and the actually perceived social presence of the robot, largely due to mis-aligned social cues, which might have complicated the interpretation of the current results. In future studies, incorporating human characteristics into embodied agents to enhance their social presence should be attempted with special care when designing human–robot social touch experiments. In particular, we wonder how other communicative cues, such as gaze behaviour, can influence impressions of social touch interactions, as indicated in previous work [42]. Many studies have demonstrated that higher social presence positively affects people’s perception of a robot [41, 51] as well as the relationship with the robot [31, 70]. In the context of social touch, further investigation of the relationship between social presence and touch responses would be beneficial, for example whether perceived social presence of the robot is a prerequisite for eliciting human (physiological) responses to touch, and to what extent social presence must be experienced by the human user in order to elicit such responses. Moreover, the surface material of the robot can influence the touch experience and its evaluation [82] and should be considered in future work.
Third, the context of the touch interaction in our study was a lab-based environment, where the role of the robot was not specified. Thus, the intended function of the touch was open for the participants’ interpretations. This and other contextual factors will influence people’s responses to the touch. Chen et al. [9] found that perception of robot touch (both enjoyability and favourability) was dependent on the robot’s verbal warning and the function of the touch. Dougherty et al. [20] explored how touch may be used to induce trust in an android-based business case scenario. Touch is an often overlooked channel of communication and sometimes even more powerful and persuasive than language. Future studies can benefit from investigating further how the perceived function of touch might influence physiological responses. Moreover, perceived gender and age of the robot could also impact on the physiological responses. Generally speaking, the design of future studies in the area of human–robot social touch interactions should take contextual and individual difference factors into consideration [22, 23]. One example of such work is facial character analysis [30]. Clearly, humans’ attitudes towards humanoid robots and their resulting emotions will influence their interpretation of tactile interactions.
Fourth, some specific parameters of touch, for example, touch force [3] and touch amount [78] that were previously found to be important factors, were not investigated in the present study. Furthermore, the physical qualities of human and robot touch also differ [28], which was also not addressed in this exploratory study. In future studies, it will be helpful to explore the relative roles of these parameters in the human–robot social touch interaction and how they might impact on human physiological responses. Finally, a real control condition with human–human touch interaction should be incorporated into the design of future studies to provide a reference for the interpretation of effects.
4.4 Practical Implications
Perhaps the most important message from these initial findings is that human physiological responses to bidirectional social touch interactions with a humanoid robot can be reliably measured in skin conductance. Despite the challenges in making coherent interpretations of the current findings in light of the available evidence in the literature, these preliminary findings can be seen as a cautious message to those who work in robot design and in applications of social robots to take account of possible physiological responses in humans. This is particularly relevant to domains such as healthcare, education and entertainment where human–robot physical interactions will invariably involve different body parts and where quality of care may be improved by optimizing touch experiences.
5 Conclusions
This study investigated physiological as well as subjective responses to touch interactions with a humanoid robot. We found that humans systematically responded to such tactile stimulation with changes in skin conductance, a measure that signals emotional arousal and engagement. Our exploratory study is the first to report both physiological and psychological consequences of a robot seeking permission to touch a human on an intimate body part. Further work on social touch between humans and humanoids can have implications for robot design in all domains involving tactile interactions, such as caring and intimacy.
6 Appendix 1: All Variables Collected in the Study
| Independent variables | Dependent variables | |
| --- | --- | --- |
| Touch style: touching, being touched/asked PTT and pointing; Body part: hand vs buttock | Physiological measure: Skin conductance | Subjective ratings: Perceived impression of the robot (age, gender); Perceived impression of the interaction (friendliness, comfort and arousal) |
| Condition 1: Actively touching the robot | | |
| Condition 2: Passively being touched/asked PTT by the robot | | |
| Condition 3: Pointing to the robot (control) | | |
| Covariates (measured, but not randomized/controlled): Demographic info. (age, gender, discipline, sexual orientation and robot experience); Attitude towards robots (NARS*) | | |
7 Appendix 2: Monologue Text and Voice Setting of the Pepper Robot
Monologue text: “Hello, my name is Alex. I am a male humanoid robot. Welcome to this touch experiment. At the experiment, I will ask you to touch my body. Sometimes I will ask you to point to my body, and sometimes I will touch your body. When I ask you to touch me, please touch me with your dominant hand; and when I ask you to point at me, please point at me with your dominant finger; please keep your other hand with the sensor. OK, let’s get started.”
Voice setting: The voice setting in the software Choregraphe (version 2.5.5) was controlled by two parameters (i.e., voice shaping and voice speed). Voice shaping regulates the tone and pitch of the voice and was set at 50% (range 50–150%; the lower the value, the deeper the voice). Voice speed was set at 75% (range 50–200%).
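Comparable settings can also be applied at the API level. The sketch below builds a tagged utterance using the inline \vct\ (pitch/voice shaping) and \rspd\ (speed) tags of NAOqi’s ALTextToSpeech; the host/port in the comment are placeholders, not values from the study, and the exact tag ranges on the robot may differ from Choregraphe’s percentages:

```python
# Illustrative mapping of the reported Choregraphe settings onto NAOqi's
# inline TTS tags. Values below come from the study's voice settings.
VOICE_SHAPING = 50  # percent (range 50-150; lower = deeper voice)
VOICE_SPEED = 75    # percent (range 50-200)

def tagged(text, shaping=VOICE_SHAPING, speed=VOICE_SPEED):
    """Prefix `text` with inline tags so ALTextToSpeech.say() applies
    the study's pitch and speed settings."""
    return "\\vct={}\\ \\rspd={}\\ {}".format(shaping, speed, text)

line = tagged("Hello, my name is Alex. I am a male humanoid robot.")
# On a real Pepper one would then run (hypothetical connection details):
#   from naoqi import ALProxy
#   tts = ALProxy("ALTextToSpeech", "pepper.local", 9559)
#   tts.say(line)
```

Scripting the utterance this way (rather than fixing it in a Choregraphe box) would also make it easier to vary voice parameters between conditions in follow-up studies.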
8 Appendix 3. Pre-experiment Questionnaire
9 Appendix 4: Post-experiment Questionnaire
10 Appendix 5: Non-significant Results Relating to Phasic Responses
| Section | Non-significant results |
| --- | --- |
| 3.2 Differences in average phasic responses | No significant effects were found for the main factors or the interaction. Touch style: F(2, 40) = 1.537, p = 0.227, partial η² = 0.071, power = 0.307; Body part: F(1, 20) = 1.577, p = 0.224, partial η² = 0.072, power = 0.223; Touch style × Body part: F(1.36, 24.17) = 0.676, p = 0.424, partial η² = 0.042, power = 0.190 (sphericity not assumed; degrees of freedom adjusted using the Greenhouse–Geisser correction) |
| 3.3.1 Peak rise time | No significant effect was found for the interaction. Touch style × Body part: F(2, 40) = 2.454, p = 0.099, partial η² = 0.109, power = 0.465 |
| 3.3.2 Peak amplitude | No significant effects were found for body part or the interaction. Body part: F(1, 20) = 0.188, p ≥ 0.05, partial η² = 0.009, power = 0.070; Touch style × Body part: F(2, 40) = 1.039, p ≥ 0.05, partial η² = 0.049, power = 0.219 |
| 3.3.3 Peak recovery time | No significant effects were found for the main factors or the interaction. Touch style: F(2, 40) = 0.324, p ≥ 0.05, partial η² = 0.016, power = 0.098; Body part: F(1, 20) = 0.091, p ≥ 0.05, partial η² = 0.005, power = 0.060; Touch style × Body part: F(2, 40) = 0.556, p ≥ 0.05, partial η² = 0.027, power = 0.136 |
Notes
Please note that neither the Nao robot nor the Pepper robot possesses visible genitalia.
iMotions (version 7; iMotions A/S, Frederiksberg Allé 1–3, 1621 København V, Denmark); https://imotions.com/gsr/.
IBM Corp. Released 2017. IBM SPSS Statistics for Windows, Version 25.0. Armonk, NY: IBM Corp.
RStudio Team (2016, version 1.0.136). RStudio: Integrated Development for R. RStudio, Inc., Boston, MA. http://www.rstudio.com/.
Softbank Robotics Europe SAS (formerly Aldebaran Robotics; 43, rue du Colonel Pierre Avia, 75,015 Paris, France); http://doc.aldebaran.com/2-4/software/choregraphe/index.html
References
App B, McIntosh DN, Reed CL, Hertenstein MJ (2011) Nonverbal channel use in communication of emotion: how may depend on why. Emotion 11:603–617. https://doi.org/10.1037/a0023164
Avraham G, Nisky I, Fernandes HL, Acuna DE, Kording KP, Loeb GE, Karniel A (2012) Toward perceiving robots as humans: three handshake models face the Turing-like handshake test. IEEE Trans Haptics 5:196–207. https://doi.org/10.1109/TOH.2012.16
Bailenson JN, Yee N (2008) Virtual interpersonal touch: Haptic interaction and copresence in collaborative virtual environments. Multimed Tools Appl 37:5–14. https://doi.org/10.1007/s11042-007-0171-2
Boucsein W (2012) Electrodermal activity, 2nd edn. Springer, Boston. https://doi.org/10.1007/978-1-4614-1126-0
Breazeal C (2003) Toward sociable robots. Robot Auton Syst 42:167–175. https://doi.org/10.1016/S0921-8890(02)00373-1
Bush E (2001) The use of human touch to improve the well-being of older adults: A holistic nursing intervention. J Holist Nurs 19:256–270. https://doi.org/10.1177/089801010101900306
Camps J, Tuteleers C, Stouten J, Nelissen J (2013) A situational touch: How touch affects people’s decision behavior. Soc Influ 8:237–250. https://doi.org/10.1080/15534510.2012.719479
Chang SO (2001) The conceptual structure of physical touch in caring. J Adv Nurs 33:820–827. https://doi.org/10.1046/j.1365-2648.2001.01721.x
Chen TL, King C-HA, Thomaz AL, Kemp CC (2014) An investigation of responses to robot-initiated touch in a nursing context. Int J of Soc Robot 6:141–161. https://doi.org/10.1007/s12369-013-0215-x
Coan JA, Schaefer HS, Davidson RJ (2006) Lending a hand: Social regulation of the neural response to threat. Psychol Sci 17:1032–1039. https://doi.org/10.1111/j.1467-9280.2006.01832.x
Collier G (1985) Emotional expression. Psychology Press, New York. https://doi.org/10.4324/9781315802411
Cramer H, Kemper N, Amin A, Wielinga B, Evers V (2009) ‘Give me a hug’: the effects of touch and autonomy on people’s responses to embodied social agents. Comp Anim Virtual Worlds 20:437–445. https://doi.org/10.1002/cav.317
Cranny-Francis A (2011) Semefulness: a social semiotics of touch. Soc Semiot 21:463–481. https://doi.org/10.1080/10350330.2011.591993
Cross ES, Hortensius R, Wykowska A (2019) From social brains to social robots: applying neurocognitive insights to human–robot interaction. Phil Trans R Soc B. https://doi.org/10.1098/rstb.2018.0024
Crusco AH, Wetzel CG (1984) The Midas touch: The effects of interpersonal touch on restaurant tipping. Pers Soc Psychol Bull 10:512–517. https://doi.org/10.1177/0146167284104003
Dawson ME, Schell AM, Courtney CG (2011) The skin conductance response, anticipation, and decision-making. J Neurosci Psychol Econ 4:111–116. https://doi.org/10.1037/a0022619
Dawson ME, Schell AM, Filion DL (2007) The electrodermal system. In: Cacioppo JT, Tassinary LG, Berntson G (eds) Handbook of psychophysiology, 3rd edn. Cambridge University Press, New York, pp 159–181
Ditzen B, Neumann ID, Bodenmann G, von Dawans B, Turner RA, Ehlert U, Heinrichs M (2007) Effects of different kinds of couple interaction on cortisol and heart rate responses to stress in women. Psychoneuroendocrinology 32:565–574. https://doi.org/10.1016/j.psyneuen.2007.03.011
Dolin DJ, Booth-Butterfield M (1993) Reach out and touch someone: analysis of nonverbal comforting responses. Commun Q 41:383–393. https://doi.org/10.1080/01463379309369899
Dougherty EG, Scharfe H (2011) Initial formation of trust: designing an interaction with Geminoid-DK to promote a positive attitude for cooperation. In: International conference on social robotics. Springer, pp 95–103
Drescher VM, Whitehead WE, Morrill-Corbin ED, Cataldo MF (1985) Physiological and subjective reactions to being touched. Psychophysiology 22:96–100. https://doi.org/10.1111/j.1469-8986.1985.tb01565.x
van Erp JBF, Toet A (2013) How to touch humans: guidelines for social agents and robots that can touch. In: 2013 humaine association conference on affective computing and intelligent interaction, pp 780–785. https://doi.org/10.1109/ACII.2013.145
van Erp JBF, Toet A (2015) Social touch in human-computer interaction. Front Digit Humanit. https://doi.org/10.3389/fdigh.2015.00002
Faul F, Erdfelder E, Buchner A, Lang A-G (2009) Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses. Behav Res Methods 41:1149–1160. https://doi.org/10.3758/BRM.41.4.1149
Field T (2001) Touch. MIT Press, Cambridge
Field T (2010) Touch for socioemotional and physical well-being: a review. Dev Rev 30:367–383. https://doi.org/10.1016/j.dr.2011.01.001
Fukuda H, Shiomi M, Nakagawa K, Ueda K (2012) ‘Midas touch’ in human–robot interaction: Evidence from event-related potentials during the ultimatum game. In: Proceedings of the seventh annual ACM/IEEE international conference on Human–Robot Interaction (HRI ’12), ACM, pp 131–132. https://doi.org/10.1145/2157689.2157720
Gallace A, Spence C (2010) The science of interpersonal touch: An overview. Neurosci Biobehav Rev 34:246–259. https://doi.org/10.1016/j.neubiorev.2008.10.004
Gazzaniga MS, Ivry RB, Mangun GR (2013) Cognitive neuroscience: the biology of the mind, 4th edn. W.W. Norton, New York
Göngör F, Tutsoy Ö (2019) Design and implementation of a facial character analysis algorithm for humanoid robots. Robotica 37:1850–1866. https://doi.org/10.1017/S0263574719000304
Gonsior B, Sosnowski S, Buß M, Wollherr D, Kühnlenz K (2012) An emotional adaption approach to increase helpfulness towards a robot. In: 2012 IEEE/RSJ international conference on intelligent robots and systems, pp 2429–2436. https://doi.org/10.1109/IROS.2012.6385941
Grewen KM, Anderson BJ, Girdler SS, Light KC (2003) Warm partner contact is related to lower cardiovascular reactivity. Behav Med 29:123–130. https://doi.org/10.1080/08964280309596065
Guéguen N (2002) Touch, awareness of touch, and compliance with a request. Percept Mot Skills 95:355–360. https://doi.org/10.2466/pms.2002.95.2.355
Harlow HF (1958) The nature of love. Am Psychol 13:673–685. https://doi.org/10.1037/h0047884
Havranek MM, Bolliger B, Roos S, Pryce CR, Quednow BB, Seifritz E (2016) Uncontrollable and unpredictable stress interacts with subclinical depression and anxiety scores in determining anxiety response. Stress 19:53–62. https://doi.org/10.3109/10253890.2015.1117449
Heinrichs M, Baumgartner T, Kirschbaum C, Ehlert U (2003) Social support and oxytocin interact to suppress cortisol and subjective responses to psychosocial stress. Biol Psychiat 54:1389–1398. https://doi.org/10.1016/S0006-3223(03)00465-7
Henricson M, Ersson A, Määttä S, Segesten K, Berglund AL (2008) The outcome of tactile touch on stress parameters in intensive care: a randomized controlled trial. Complement Ther Clin Pract 14:244–254. https://doi.org/10.1016/j.ctcp.2008.03.003
Hertenstein MJ, Holmes R, McCullough M, Keltner D (2009) The communication of emotion via touch. Emotion 9:566–573. https://doi.org/10.1037/a0016108
Hertenstein MJ, Keltner D, App B, Bulleit BA, Jaskolka AR (2006) Touch communicates distinct emotions. Emotion 6:528–533. https://doi.org/10.1037/1528-3542.6.3.528
Hertenstein MJ, Verkamp JM, Kerestes AM, Holmes RM (2006) The communicative functions of touch in humans, nonhuman primates, and rats: a review and synthesis of the empirical research. Genet Soc Gen Psychol Monogr 132:5–94. https://doi.org/10.3200/MONO.132.1.5-94
Hinds PJ, Roberts TL, Jones H (2004) Whose job is it anyway? A study of human–robot interaction in a collaborative task. Hum Comput Interact 19:151–181
Hirano T, Shiomi M, Iio T, Kimoto M, Tanev I, Shimohara K, Hagita N (2018) How do communication cues change impressions of human–robot touch interaction? Int J Soc Robot 10:21–31. https://doi.org/10.1007/s12369-017-0425-8
Johnson LC, Lubin A (1966) Spontaneous electrodermal activity during waking and sleeping. Psychophysiology 3:8–17. https://doi.org/10.1111/j.1469-8986.1966.tb02673.x
Jones SE, Yarbrough AE (1985) A naturalistic study of the meanings of touch. Commun Monogr 52:19–56. https://doi.org/10.1080/03637758509376094
Joule R-V, Guéguen N (2007) Touch, compliance, and awareness of tactile contact. Percept Mot Skills 104:581–588. https://doi.org/10.2466/pms.104.2.581-588
Jourard SM (1966) An exploratory study of body-accessibility. Br J Soc Clin Psychol 5:221–231. https://doi.org/10.1111/j.2044-8260.1966.tb00978.x
Jung Y, Lee KM (2004) Effects of physical embodiment on social presence of social robots. In: 7th annual international workshop on presence (PRESENCE 2004), pp 80–87
Kawamichi H, Kitada R, Yoshihara K, Takahashi HK, Sadato N (2015) Interpersonal touch suppresses visual processing of aversive stimuli. Front Hum Neurosci. https://doi.org/10.3389/fnhum.2015.00164
Lang PJ (1995) The emotion probe: Studies of motivation and attention. Am Psychol 50:372–385. https://doi.org/10.1037/0003-066X.50.5.372
Lee KM (2004) Presence, explicated. Commun Theory 14:27–50. https://doi.org/10.1111/j.1468-2885.2004.tb00302.x
Lee KM, Jung Y, Kim J, Kim SR (2006) Are physically embodied social agents better than disembodied social agents? The effects of physical embodiment, tactile interaction, and people’s loneliness in human–robot interaction. Int J Hum Comput Stud 64:962–973. https://doi.org/10.1016/j.ijhcs.2006.05.002
Levy D (2007) Love and sex with robots: The evolution of human-robot relations. HarperCollins, New York
Li JJ, Ju W, Reeves B (2017) Touching a mechanical body: tactile contact with body parts of a humanoid robot is physiologically arousing. J Hum Robot Interact 6:118–130
Linden DJ (2015) Touch: the science of hand, heart and mind. Penguin Books, London
Maddaluno O, Guidali G, Zazio A, Miniussi C, Bolognini N (2020) Touch anticipation mediates cross-modal Hebbian plasticity in the primary somatosensory cortex. Cortex 126:173–181. https://doi.org/10.1016/j.cortex.2020.01.008
Major B, Heslin R (1982) Perceptions of cross-sex and same-sex nonreciprocal touch: It is better to give than to receive. J Nonverbal Behav 6:148–162. https://doi.org/10.1007/BF00987064
Marchesi S, Ghiglino D, Ciardo F, Baykara E, Wykowska A (2019) Do we adopt the intentional stance toward humanoid robots? Front Psychol. https://doi.org/10.3389/fpsyg.2019.00450
McCance RA, Otley M (1951) Course of the blood urea in newborn rats, pigs and kittens. J Physiol 113:18–22. https://doi.org/10.1113/jphysiol.1951.sp004552
Morrison I, Löken LS, Olausson H (2010) The skin as a social organ. Exp Brain Res 204:305–314. https://doi.org/10.1007/s00221-009-2007-y
Moyle W, Jones CJ, Murfield JE, Thalib L, Beattie ER, Shum DK, O’Dwyer ST, Mervin MC, Draper BM (2017) Use of a robotic seal as a therapeutic tool to improve dementia symptoms: a cluster-randomized controlled trial. J Am Med Dir Assoc 18:766–773. https://doi.org/10.1016/j.jamda.2017.03.018
Mukai T, Hirano S, Nakashima H, Kato Y, Sakaida Y, Guo S, Hosoe S (2010) Development of a nursing-care assistant robot RIBA that can lift a human in its arms. In: 2010 IEEE/RSJ international conference on intelligent robots and systems, pp 5996–6001. https://doi.org/10.1109/IROS.2010.5651735
Nakagawa K, Shiomi M, Shinozawa K, Matsumura R, Ishiguro H, Hagita N (2011) Effect of robot’s active touch on people’s motivation. In: Proceedings of the 6th international conference on human-robot interaction (HRI ’11), ACM, pp 465–472. https://doi.org/10.1145/1957656.1957819
Nomura T, Kanda T, Suzuki T (2006) Experimental investigation into influence of negative attitudes toward robots on human–robot interaction. AI & Soc 20:138–150. https://doi.org/10.1007/s00146-005-0012-7
Nomura T, Kanda T, Suzuki T, Kato K (2004) Psychology in human-robot communication: An attempt through investigation of negative attitudes and anxiety toward robots. In: RO-MAN 2004. 13th IEEE international workshop on robot and human interactive communication (IEEE catalog no.04TH8759), IEEE, pp 35–40. https://doi.org/10.1109/ROMAN.2004.1374726
Nomura T, Kanda T, Suzuki T, Kato K (2008) Prediction of human behavior in human–robot interaction using psychological scales for anxiety and negative attitudes toward robots. IEEE Trans Robot 24:442–451. https://doi.org/10.1109/TRO.2007.914004
Nomura T, Suzuki T, Kanda T, Kato K (2006) Measurement of negative attitudes toward robots. Interact Stud 7:437–454. https://doi.org/10.1075/is.7.3.14nom
Nummenmaa L, Glerean E, Hari R, Hietanen JK (2014) Bodily maps of emotions. Proc Natl Acad Sci 111:646–651. https://doi.org/10.1073/pnas.1321664111
Robinson H, MacDonald B, Broadbent E (2015) Physiological effects of a companion robot on blood pressure of older people in residential care facility: a pilot study. Aust J Ageing 34:27–32. https://doi.org/10.1111/ajag.12099
Robinson H, MacDonald B, Kerse N, Broadbent E (2013) The psychosocial effects of a companion robot: a randomized controlled trial. J Am Med Dir Assoc 14:661–667. https://doi.org/10.1016/j.jamda.2013.02.007
Schneider S, Berger I, Riether N, Wrede S, Wrede B (2012) Effects of different robot interaction strategies during cognitive tasks. In: Ge SS, Khatib O, Cabibihan JJ, Simmons R, Williams MA (eds) Social robotics. Springer, Berlin, pp 496–505. https://doi.org/10.1007/978-3-642-34103-8_50
Sefidgar YS, MacLean KE, Yohanan S, Van der Loos HM, Croft EA, Garland EJ (2016) Design and evaluation of a touch-centered calming interaction with a social robot. IEEE Trans Affect Comput 7:108–121. https://doi.org/10.1109/TAFFC.2015.2457893
Shiomi M, Nakagawa K, Shinozawa K, Matsumura R, Ishiguro H, Hagita N (2017) Does a robot’s touch encourage human effort? Int J Soc Robot 9:5–15. https://doi.org/10.1007/s12369-016-0339-x
Society for Psychophysiological Research Ad Hoc Committee on Electrodermal Measures (2012) Publication recommendations for electrodermal measurements: publication standards for EDA. Psychophysiology 49:1017–1034. https://doi.org/10.1111/j.1469-8986.2012.01384.x
Stier DS, Hall JA (1984) Gender differences in touch: An empirical and theoretical review. J Pers Soc Psychol 47:440–459. https://doi.org/10.1037/0022-3514.47.2.440
Suk H-J, Jeong S-H, Yang T-H, Kwon D-S (2009) Tactile sensation as emotion elicitor. KANSEI Eng Int 8:153–158. https://doi.org/10.5057/E081120-ISES06
Świdrak J, Pochwatko G, Navarro X, Osęka L, Doliński D (2020) The joint influence of social status and personal attitudes in a contact and open versus a noncontact and homophobic culture on the virtual Midas touch. Virtual Real. https://doi.org/10.1007/s10055-019-00423-8
Syrdal DS, Dautenhahn K, Koay KL, Walters ML (2009) The negative attitudes towards robots scale and reactions to robot behaviour in a live human-robot interaction study. In: Adaptive and emergent behaviour and complex systems: proceedings of the 23rd convention of the society for the study of Artificial Intelligence and Simulation Of Behaviour (AISB 2009), pp 109–115
Tanaka F, Cicourel A, Movellan JR (2007) Socialization between toddlers and robots at an early childhood education center. Proc Natl Acad Sci 104:17954–17958. https://doi.org/10.1073/pnas.0707769104
Tomita M (2008) Exploratory study of touch zones in college students on two campuses. Calif J Health Prom 6:1–22. https://doi.org/10.32398/cjhp.v6i1.1289
United Nations Economic Commission for Europe, International Federation of Robotics (2005) World robotics 2005. UN, New York
Vallbo Å, Olausson H, Wessberg J, Norrsell U (1993) A system of unmyelinated afferents for innocuous mechanoreception in the human skin. Brain Res 628:301–304. https://doi.org/10.1016/0006-8993(93)90968-S
Vlachos E, Jochum E, Demers LP (2016) The effects of exposure to different social robots on attitudes toward preferences. Interact Stud 17(3):390–404
Vrana SR, Rollock D (1998) Physiological response to a minimal social encounter: Effects of gender, ethnicity, and social context. Psychophysiology 35:462–469. https://doi.org/10.1111/1469-8986.3540462
Walker R, Bartneck C (2013) The pleasure of receiving a head massage from a robot. In: 2013 IEEE RO-MAN, pp 807–813. https://doi.org/10.1109/ROMAN.2013.6628412
Whitcher SJ, Fisher JD (1979) Multidimensional reaction to therapeutic touch in a hospital setting. J Pers Soc Psychol 37:87–96. https://doi.org/10.1037/0022-3514.37.1.87
Wilhelm FH, Kochar AS, Roth WT, Gross JJ (2001) Social anxiety and response to touch: Incongruence between self-evaluative and physiological reactions. Biol Psychol 58:181–202. https://doi.org/10.1016/S0301-0511(01)00113-2
Willemse CJAM, van Erp JBF (2019) Social touch in human–robot interaction: robot-initiated touches can induce positive responses without extensive prior bonding. Int J Soc Robot 11:285–304. https://doi.org/10.1007/s12369-018-0500-9
Willemse CJAM, Toet A, van Erp JBF (2017) Affective and behavioral responses to robot-initiated social touch: toward understanding the opportunities and limitations of physical contact in human–robot interaction. Front ICT. https://doi.org/10.3389/fict.2017.00012
Wykowska A, Chaminade T, Cheng G (2016) Embodied artificial agents for understanding human social cognition. Philos Trans R Soc B. https://doi.org/10.1098/rstb.2015.0375
Yu R, Hui E, Lee J, Poon D, Ng A, Sit K, Ip K, Yeung F, Wong M, Shibata T, Woo J (2015) Use of a therapeutic, socially assistive pet robot (PARO) in improving mood and stimulating social interaction and communication for people with dementia: study protocol for a randomized controlled trial. JMIR Res Protoc. https://doi.org/10.2196/resprot.4189
Zhou Y, Fischer MH (2019) AI love you: developments in human–robot intimate relationships. Springer, Cham
Acknowledgements
We thank all participants, as well as the Cognitive Sciences Structure at University of Potsdam, Germany, which provided the seed funding, and the Potsdam Embodied Cognition Group (PECoG) for stimulating discussions and valuable feedback. In particular, we thank Alex Miklashevsky for sharing data analysis experience and Arianna Felisatti for assistance with recruitment. We are grateful to members of the Telematics Department, Technical University of Applied Sciences Wildau (Benjamin Stahl, Janine Bressler, Julia Reinke) for their expertise and effort in programming the robot and assisting with the experiment. Moreover, we thank Professor Ralf Brand and his team (Sinika Timme, Michaela Schinköth) for lending equipment and sharing analysis experience. Finally, we appreciate valuable feedback from members of the Italian National Research Centre in Rome (Amedeo Cesta, Gabriella Cortellessa and Francesca Fracasso) and from students and staff of the Social Psychology Division (Professor Barbara Krahé) at University of Potsdam.
Funding
Open Access funding enabled and organized by Projekt DEAL. This study was funded by internal seed funds of the Cognitive Sciences Structure at University of Potsdam (no reference number).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Zhou, Y., Kornher, T., Mohnke, J. et al. Tactile Interaction with a Humanoid Robot: Effects on Physiology and Subjective Impressions. Int J of Soc Robotics 13, 1657–1677 (2021). https://doi.org/10.1007/s12369-021-00749-x