Feeling of Safety and Comfort towards a Socially Assistive Unmanned Aerial Vehicle That Monitors People in a Virtual Home
Figure 1: 3D representation in isometric perspective of the UAV's monitoring altitude, $z_m$, which is determined by the person's height, $z_p$, plus the relative monitoring altitude, $z_r$.
Figure 2: 3D representation in isometric perspective of the trajectory according to the UAV's monitoring radius: (a) elliptical trajectory closer to the person's face ($R_x > R_y$); (b) circular trajectory in which the monitoring radius is constant ($R_x = R_y$); (c) elliptical trajectory farther from the person's face ($R_x < R_y$).
Figure 3: General diagram of the UAV simulator, which receives from the VR visualiser the information concerning the person's avatar, calculates the reference trajectories used by the controller to guide the UAV in the monitoring process, and returns the aircraft's position and orientation so that its flight can be represented in the virtual home environment.
Figure 4: Variables of the parametric equation of the ellipse centred at the XY position of the person and rotated according to their orientation.
Figure 5: Determination of the safety position $(x_{sp}, y_{sp}, z_{sp})$, which the UAV approaches to start the elliptical monitoring lap around the person.
Figure 6: Graphical representation of the trajectory planner's states: (a) states 0—home, 1—takeoff, and 2—orientation towards the person; (b) state 3—approximation; (c) state 4—waiting in safety position; (d) state 5—orbit around the person; (e) state 6—data capture; (f) state 7—motion to safety position; (g) state 8—orientation towards the base; (h) state 9—return to base; (i) state 10—yaw angle adjustment; (j) state 11—landing.
Figure 7: Results of the tests to verify the new ellipsoidal trajectory planner, carried out using the UAV simulator (part of the VR platform, implemented in MATLAB/Simulink®). For each test, the following is represented: (1) the trajectory generated by the planner (blue) against the actual trajectory flown by the quadrotor model under the action of the GPI controller (red); (2) the orientation of the UAV's camera, shown as arrows whose colour changes over time; (3) way-points: base position (blue circle), safety position (green circle), and the person's position and orientation (pink circle and arrow).
Figure 8: Architecture of the distributed platform.
Figure 9: Distribution of responses for each question in the experiment with younger adults.
Figure 10: User preference for each of the variables measured in the experiment with younger adults.
Figure 11: Distribution of responses for each question in the experiment with older adults.
Figure 12: User preference for each of the variables measured in the experiment with older adults.
Abstract
1. Introduction
2. General Description of the System
3. Trajectory Planning
3.1. Equations of the Ellipse
The reference positions are generated from the parametric equations of an ellipse centred at $(x_c, y_c)$ and rotated by an angle $\theta$:

$$x(\alpha) = x_c + R_x \cos\alpha \cos\theta - R_y \sin\alpha \sin\theta$$
$$y(\alpha) = y_c + R_x \cos\alpha \sin\theta + R_y \sin\alpha \cos\theta$$

where:
- $x_c$ is the X-coordinate of the centre [m],
- $y_c$ is the Y-coordinate of the centre [m],
- $R_x$ is the radius on the X-axis [m],
- $R_y$ is the radius on the Y-axis [m],
- $\theta$ is the rotation angle [rad], and
- $\alpha$ is the parameter, which ranges from 0 to $2\pi$.
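To make the parametrisation concrete, here is a minimal Python sketch (the platform's simulator is implemented in MATLAB/Simulink®; the translation to Python and the name `ellipse_point` are ours):

```python
import math

def ellipse_point(xc, yc, Rx, Ry, theta, alpha):
    """Point on an ellipse centred at (xc, yc), with radii Rx and Ry [m],
    rotated by theta [rad], for parameter alpha in [0, 2*pi)."""
    x = xc + Rx * math.cos(alpha) * math.cos(theta) - Ry * math.sin(alpha) * math.sin(theta)
    y = yc + Rx * math.cos(alpha) * math.sin(theta) + Ry * math.sin(alpha) * math.cos(theta)
    return x, y

# Example: a circular trajectory (Rx = Ry) of radius 1.5 m around a person
# standing at (2.0, 3.0) and oriented at theta = pi/4.
for k in range(8):
    print(ellipse_point(2.0, 3.0, 1.5, 1.5, math.pi / 4, k * math.pi / 4))
```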
3.2. Determination of the Safety Position
3.2.1. Calculation of the Intersection Points with the Rotated Ellipse
Substituting the straight line that joins the UAV's and the person's XY positions into the equation of the rotated ellipse yields a quadratic equation with two roots, i.e., two candidate intersection points:
- Solution 1: X- and Y-coordinates of the first intersection point.
- Solution 2: X- and Y-coordinates of the second intersection point.
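Since the line passes through the ellipse's centre (the person's XY position), the two roots are symmetric about that centre. A minimal Python sketch of the calculation under that assumption (function and variable names are ours, not the paper's):

```python
import math

def line_ellipse_intersections(xu, yu, xc, yc, Rx, Ry, theta):
    """Intersections with the rotated ellipse of the line joining the UAV
    (xu, yu) and the person (xc, yc), who is also the ellipse's centre.
    Writing points as (xc, yc) + t*(dx, dy) and substituting into the
    ellipse equation gives a quadratic in t with two symmetric roots."""
    dx, dy = xu - xc, yu - yc          # assumes the UAV is not over the centre
    # Express the direction in the ellipse's local (unrotated) frame.
    dxe = math.cos(theta) * dx + math.sin(theta) * dy
    dye = -math.sin(theta) * dx + math.cos(theta) * dy
    # (t*dxe/Rx)^2 + (t*dye/Ry)^2 = 1  =>  t = +/- 1/sqrt(...)
    t = 1.0 / math.sqrt((dxe / Rx) ** 2 + (dye / Ry) ** 2)
    sol1 = (xc + t * dx, yc + t * dy)  # Solution 1: on the UAV's side
    sol2 = (xc - t * dx, yc - t * dy)  # Solution 2: on the opposite side
    return sol1, sol2
```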
3.2.2. Determination of the Nearest Intersection Point (Safety Position)
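The nearest of the two candidates to the UAV's current position is taken as the safety position, which reduces to a Euclidean distance comparison. A minimal sketch, reusing the function above (names are ours):

```python
import math

def safety_position(xu, yu, candidates):
    """Select, among the candidate intersection points, the one nearest to
    the UAV's current XY position: this is the safety position (x_sp, y_sp)."""
    return min(candidates, key=lambda p: math.hypot(p[0] - xu, p[1] - yu))

# Usage:
# sol1, sol2 = line_ellipse_intersections(xu, yu, xc, yc, Rx, Ry, theta)
# x_sp, y_sp = safety_position(xu, yu, (sol1, sol2))
```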
3.3. Ellipsoidal Trajectory around the Person
The correspondence between the safety position on the ellipse and the parameter of its parametric equation is resolved case by case, according to the quadrant of the ellipse's local frame in which the point lies (Quadrants I–IV); the condition and the parameter expression for each quadrant are given by Equations (17)–(22).
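The same quadrant logic can be collapsed into a single expression with `atan2`, which resolves the signs of both local coordinates at once. The following sketch is an equivalent reformulation of that case analysis, not a transcription of the paper's Equations (17)–(22):

```python
import math

def ellipse_parameter(x, y, xc, yc, Rx, Ry, theta):
    """Recover the parameter alpha in [0, 2*pi) for a point (x, y) lying on
    the rotated ellipse. The point is transformed into the ellipse's local
    frame and the parametrisation is inverted; atan2 resolves the quadrant,
    covering the four cases (Quadrants I-IV) in one expression."""
    dx, dy = x - xc, y - yc
    xe = math.cos(theta) * dx + math.sin(theta) * dy    # = Rx * cos(alpha)
    ye = -math.sin(theta) * dx + math.cos(theta) * dy   # = Ry * sin(alpha)
    return math.atan2(ye / Ry, xe / Rx) % (2 * math.pi)
```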
3.4. States of the Trajectory Planner
- STATE 0—Home: Represents the UAV waiting at its base position, oriented with its initial yaw angle, which by default is equal to 0 rad. When it receives the instruction to start the monitoring process, it transitions to state 1 (the complete cycle of states is sketched in code after this list).
- STATE 1—Takeoff: This is the first manoeuvre, raising the UAV at a constant speed from the base level to the monitoring altitude, $z_m$. This altitude equals the person's height (measured at the centre of their head, $z_p$) plus the relative altitude parameter ($z_r$), which can be positive, zero, or even negative so as to cover the three scenarios mentioned before (above their head, level with it, or below it). When the UAV reaches the monitoring altitude, $z_m = z_p + z_r$, the planner switches to state 2.
- STATE 2—Orientation Towards the Person: After takeoff, the UAV is commanded to turn, varying its yaw angle at a defined angular speed, in order to find the person with its on-board camera. Since the person's position is known at every moment, the final (target) yaw angle can be calculated as the difference between the angle towards the person and the camera's mounting angle, as can be observed in Figure 6a. The yaw angle is therefore modified gradually until the UAV's camera is focused on the person, at which moment the planner transitions to state 3.
- STATE 3—Approximation: The UAV must approach the person (already centred in the camera's view) in order to turn around them in the following states. As already detailed, the UAV must reach the safety position, whose coordinates are given by the nearest intersection of the (imaginary) line joining the positions of the UAV and the person with the ellipse centred on the person's position and rotated according to their orientation. The relative monitoring altitude is kept constant, i.e., the Z-coordinate of the safety position matches the monitoring altitude ($z_{sp} = z_m$). The planner transitions to state 4 once the UAV reaches the safety position $(x_{sp}, y_{sp}, z_{sp})$.
- STATE 4—Waiting in Safety Position: Intermediate state in which the UAV waits for a short time so that the subsequent lap around the person starts more precisely. Once the programmed waiting time has elapsed, it transitions to state 5.
- STATE 5—Orbit Around the Person: At this moment, the ellipsoidal rotation of the UAV around the person starts from the safety position (whose equivalence to the parameter of the ellipse's parametric formula is determined according to Equations (17)–(22)), while the flight height is kept constant. Meanwhile, the UAV's yaw angle is modified during the ellipsoidal trajectory so that its on-board camera always points towards the person. Once the UAV's camera finds the person's face (when the UAV is in front of the face), the planner switches to state 6. Since the person's information is known, and the ellipse is rotated according to their orientation, the position in front of their face, labelled as the photo position in Figure 6d, can be calculated directly from the ellipse's parametrisation.
- STATE 6—Data Capture: In the general case, the UAV remains in the position in front of the person's face to capture images with better accuracy, and once the data capture timer elapses, the planner transitions to state 7. On the contrary, if the UAV is programmed to take pictures of the person for the entire lap around him or her, the waiting time can be set to zero and the planner transitions directly to state 7, so that the UAV continues the lap without stopping.
- STATE 7—Motion to Safety Position: The UAV continues its ellipsoidal rotation around the monitored person to reach the safety position again. The UAV's height is kept constant and the yaw angle is varied so that the on-board camera remains pointed at the person. This state is therefore mathematically equivalent to state 5 and, consequently, the reference trajectories are determined in a very similar way. Once the UAV reaches the safety position, $(x_{sp}, y_{sp}, z_{sp})$, the planner transitions to state 8.
- STATE 8—Orientation Towards the Base: Once the UAV has completed its ellipsoidal trajectory around the monitored person, it is commanded to turn on itself, gradually adjusting its yaw angle to search for the base with its camera. The procedure is similar to state 2, but the final yaw angle is now known in advance: as can be seen in Figure 6g, the UAV must turn until its camera points directly towards the base. At this point, the planner switches to state 9.
- STATE 9—Return to Base: The aircraft is ordered to return to a point above its base position while keeping the flight height constant. The UAV's movement is the same as in state 3 but in the opposite direction, so the reference trajectories are calculated in a similar way (in this case, even the final position is known in advance, so no additional calculation is necessary, unlike state 3, in which the safety position had to be determined). Once the UAV is positioned over its base, the planner transitions to state 10.
- STATE 10—Yaw Angle Adjustment: Before landing, the yaw angle of the UAV is adjusted back to its initial value to prepare the aircraft for future monitoring processes, while the UAV's position is kept constant. The reference trajectories are therefore analogous to those of state 2. When the adjustment is completed, the planner shifts to state 11.
- STATE 11—Landing: Finally, the UAV is commanded to land at its base. The aircraft descends vertically at a constant speed until reaching the base level, and the planner transitions to the initial state (0), ready for the next monitoring process.
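Since every state hands over to exactly one successor, the planner behaves as a twelve-state cyclic machine. A minimal Python sketch of this cycle (the predicates and field names are illustrative; the actual planner is implemented in MATLAB/Simulink®):

```python
from enum import Enum

class PlannerState(Enum):
    HOME = 0
    TAKEOFF = 1
    ORIENT_TO_PERSON = 2
    APPROXIMATION = 3
    WAIT_SAFETY = 4
    ORBIT = 5
    DATA_CAPTURE = 6
    MOTION_TO_SAFETY = 7
    ORIENT_TO_BASE = 8
    RETURN_TO_BASE = 9
    YAW_ADJUST = 10
    LANDING = 11

# One exit condition per state; each predicate inspects the current flight
# data (UAV pose, person pose, timers) and returns True when the planner
# must advance to the next state in the cycle.
TRANSITIONS = {
    PlannerState.HOME:             lambda s: s["start_requested"],
    PlannerState.TAKEOFF:          lambda s: s["z"] >= s["z_m"],        # monitoring altitude reached
    PlannerState.ORIENT_TO_PERSON: lambda s: s["person_centred"],       # camera focused on the person
    PlannerState.APPROXIMATION:    lambda s: s["at_safety_position"],
    PlannerState.WAIT_SAFETY:      lambda s: s["t_wait_elapsed"],
    PlannerState.ORBIT:            lambda s: s["at_photo_position"],    # in front of the face
    PlannerState.DATA_CAPTURE:     lambda s: s["t_capture_elapsed"],
    PlannerState.MOTION_TO_SAFETY: lambda s: s["at_safety_position"],
    PlannerState.ORIENT_TO_BASE:   lambda s: s["base_centred"],
    PlannerState.RETURN_TO_BASE:   lambda s: s["over_base"],
    PlannerState.YAW_ADJUST:       lambda s: s["yaw_at_initial"],
    PlannerState.LANDING:          lambda s: s["z"] <= s["z_base"],
}

def step(state: PlannerState, sensors: dict) -> PlannerState:
    """Advance the planner one tick: move to the next state (mod 12) when
    the current state's exit condition holds, otherwise stay put."""
    if TRANSITIONS[state](sensors):
        return PlannerState((state.value + 1) % 12)
    return state
```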
3.5. Simulation Results
4. Experimental Setup
4.1. Procedure
4.2. Questionnaire
4.3. Participants and Data Collection
4.4. Data Analysis
5. Results
5.1. Results of the Experiment with Younger Adults (Relatives)
5.1.1. Results for Relative Monitoring Altitude with Younger Adults
5.1.2. Results for Monitoring Velocity with Younger Adults
5.1.3. Results for Monitoring Radius with Younger Adults
5.2. Results of the Experiment with Older Adults (with Physical Impairments)
5.2.1. Results for Relative Monitoring Altitude with Older Adults
5.2.2. Results for Monitoring Velocity with Older Adults
5.2.3. Results for Monitoring Radius with Older Adults
6. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| GPI | Generalised Proportional Integral |
| HRI | Human-Robot Interaction |
| IQR | Interquartile Range |
| MQTT | Message Queue Telemetry Transport |
| SD | Standard Deviation |
| UAV | Unmanned Aerial Vehicle |
| VR | Virtual Reality |
VR | Virtual Reality |
| State | Parameters | Transition Condition [C] |
|---|---|---|
| 0—Home | base position [m]; initial yaw angle (by default 0) [rad] | instruction to start monitoring received → state 1 |
| 1—Takeoff | relative monitoring altitude [m]; velocity in Z-axis [m/s] | monitoring altitude reached → state 2 |
| 2—Orientation towards the person | angular velocity (yaw) [rad/s]; camera's angle [rad] | target yaw angle reached (person centred) → state 3 |
| 3—Approximation | radius on the X-axis [m]; radius on the Y-axis [m]; diagonal velocity [m/s]; safety position: see Section 3.2 | safety position reached → state 4 |
| 4—Waiting in safety position | timer [s] | timer elapsed → state 5 |
| 5—Orbit around the person | radius on the X-axis [m]; radius on the Y-axis [m]; monitoring angular velocity [rad/s]; camera's angle [rad]; parameter equivalence: see Section 3.3; values: see Table 3 | photo position (in front of the face) reached → state 6 |
| 6—Data capture | timer [s] | timer elapsed → state 7 |
| 7—Motion to safety position | radius on the X-axis [m]; radius on the Y-axis [m]; monitoring angular velocity [rad/s]; camera's angle [rad]; values: see Table 3 | safety position reached → state 8 |
| 8—Orientation towards the base | angular velocity (yaw) [rad/s] | target yaw angle reached (camera towards base) → state 9 |
| 9—Return to base | base position [m]; diagonal velocity [m/s] | position over the base reached → state 10 |
| 10—Yaw angle adjustment | angular velocity (yaw) [rad/s]; initial yaw angle (by default 0) [rad] | initial yaw angle reached → state 11 |
| 11—Landing | base position [m]; velocity in Z-axis [m/s] | base level reached → state 0 |
| Parameter | 1st Value (Extreme 1) | 2nd Value (Intermediate) | 3rd Value (Extreme 2) |
|---|---|---|---|
| Relative Monitoring Altitude $z_r$ [m] | Low/Below | Medium/Centred | High/Above |
| Monitoring Velocity | Low Velocity | Medium Velocity | High Velocity |
| Monitoring Radius $R_x$, $R_y$ [m] | Closer to the Face (Ellipse) | Equidistant (Circular) | Farther from the Face (Ellipse) |
| Description | Units |
|---|---|
| **Simulation** | |
| Sample Time | [s] |
| Simulation Time | [s] |
| **Quadrotor UAV** | |
| Initial Position (Base Position) | [m] |
| Initial Yaw Angle | [rad] |
| Camera's Angle | [rad] |
| Mass | [kg] |
| **Trajectory Planner (Fixed Parameters)** | |
| Velocity in Z-Axis [state 1—takeoff/state 11—landing] | [m/s] |
| Velocity in Diagonal Motion [state 3/state 9] | [m/s] |
| Angular Velocity for Yaw Adjustment | [rad/s] |
| Angular Velocity for Ellipsoidal Motion | [rad/s] |
| Timer of State 4—Waiting in Safety Position | [s] |
| Timer of State 6—Data Capture | [s] |
| Variable Parameters ⇒ see Table 3 and Figure 7 | |
| Question | Mean | SD | Median | IQR |
|---|---|---|---|---|
| **Altitude** | | | | |
| *Safety* | | | | |
| A.Saf1: I felt safe during the monitoring process by the UAV flying below my head | 3.03 | 1.14 | 3.00 | 2.00 |
| A.Saf2: I felt safe during the monitoring process by the UAV flying in front of my head | 2.54 | 1.10 | 2.00 | 1.00 |
| A.Saf3: I felt safe during the monitoring process by the UAV flying above my head | 4.05 | 0.91 | 4.00 | 1.00 |
| *Supervision* | | | | |
| A.Sup1: I feel that there is too much supervision by the UAV flying below my head | 2.89 | 1.02 | 3.00 | 1.00 |
| A.Sup2: I feel that there is too much supervision by the UAV flying in front of my head | 3.41 | 1.14 | 3.00 | 2.00 |
| A.Sup3: I feel that there is too much supervision by the UAV flying above my head | 3.05 | 1.03 | 3.00 | 1.00 |
| *Distraction* | | | | |
| A.Dis1: I think the UAV would distract me from my daily routine by flying below my head | 3.76 | 1.26 | 4.00 | 2.00 |
| A.Dis2: I think the UAV would distract me from my daily routine by flying in front of my head | 4.35 | 0.82 | 5.00 | 1.00 |
| A.Dis3: I think the UAV would distract me from my daily routine by flying above my head | 2.43 | 1.04 | 2.00 | 1.00 |
| **Velocity** | | | | |
| *Safety* | | | | |
| V.Saf1: I felt safe during the monitoring process by the UAV flying at a low velocity | 4.19 | 0.84 | 4.00 | 1.00 |
| V.Saf2: I felt safe during the monitoring process by the UAV flying at a medium velocity | 4.11 | 0.74 | 4.00 | 1.00 |
| V.Saf3: I felt safe during the monitoring process by the UAV flying at a high velocity | 3.43 | 1.19 | 4.00 | 2.00 |
| *Supervision* | | | | |
| V.Sup1: I feel that there is too much supervision by the UAV flying at a low velocity | 3.70 | 0.85 | 4.00 | 1.00 |
| V.Sup2: I feel that there is too much supervision by the UAV flying at a medium velocity | 3.32 | 0.78 | 3.00 | 1.00 |
| V.Sup3: I feel that there is too much supervision by the UAV flying at a high velocity | 2.97 | 0.80 | 3.00 | 0.00 |
| *Distraction* | | | | |
| V.Dis1: I think the UAV would distract me from my daily routine by flying at a low velocity | 3.54 | 1.37 | 4.00 | 3.00 |
| V.Dis2: I think the UAV would distract me from my daily routine by flying at a medium velocity | 3.22 | 1.16 | 3.00 | 1.00 |
| V.Dis3: I think the UAV would distract me from my daily routine by flying at a high velocity | 3.35 | 1.23 | 4.00 | 2.00 |
| *Adequacy* | | | | |
| V.Ade1: I found the low velocity of the UAV adequate | 3.03 | 1.32 | 3.00 | 2.00 |
| V.Ade2: I found the medium velocity of the UAV adequate | 3.49 | 0.87 | 4.00 | 1.00 |
| V.Ade3: I found the high velocity of the UAV adequate | 3.30 | 1.15 | 3.00 | 2.00 |
| **Trajectory** | | | | |
| *Safety* | | | | |
| T.Saf1: I felt safe during the monitoring process by the UAV flying elliptically (closer to face) | 2.70 | 1.10 | 2.00 | 2.00 |
| T.Saf2: I felt safe during the monitoring process by the UAV flying following a circular trajectory | 3.89 | 0.81 | 4.00 | 1.00 |
| T.Saf3: I felt safe during the monitoring process by the UAV flying elliptically (farther from face) | 3.68 | 0.94 | 4.00 | 1.00 |
| *Supervision* | | | | |
| T.Sup1: I feel that there is too much supervision by the UAV flying elliptically (closer to face) | 3.51 | 1.07 | 3.00 | 1.00 |
| T.Sup2: I feel that there is too much supervision by the UAV flying following a circular trajectory | 2.89 | 0.70 | 3.00 | 1.00 |
| T.Sup3: I feel that there is too much supervision by the UAV flying elliptically (farther from face) | 2.89 | 0.81 | 3.00 | 1.00 |
| *Distraction* | | | | |
| T.Dis1: I think the UAV would distract me from my daily routine by flying elliptically (closer to face) | 3.86 | 1.03 | 4.00 | 1.00 |
| T.Dis2: I think the UAV would distract me from my daily routine by flying following a circular trajectory | 2.84 | 0.99 | 3.00 | 2.00 |
| T.Dis3: I think the UAV would distract me from my daily routine by flying elliptically (farther from face) | 2.95 | 1.05 | 3.00 | 2.00 |
| *Adequacy* | | | | |
| T.Ade1: I found the UAV's monitoring distance to be correct by the UAV flying elliptically (closer to face) | 2.68 | 1.20 | 2.00 | 2.00 |
| T.Ade2: I found the UAV's monitoring distance to be correct by the UAV flying following a circular trajectory | 3.84 | 0.69 | 4.00 | 1.00 |
| T.Ade3: I found the UAV's monitoring distance to be correct by the UAV flying elliptically (farther from face) | 3.51 | 1.07 | 4.00 | 1.00 |
| Question | Mean | SD | Median | IQR |
|---|---|---|---|---|
| **Altitude** | | | | |
| *Safety* | | | | |
| A.Saf1 | 2.43 | 1.08 | 2.00 | 1.00 |
| A.Saf2 | 2.39 | 1.20 | 2.00 | 2.00 |
| A.Saf3 | 2.91 | 0.95 | 3.00 | 2.00 |
| *Supervision* | | | | |
| A.Sup1 | 3.30 | 0.97 | 3.00 | 1.00 |
| A.Sup2 | 3.43 | 1.16 | 4.00 | 1.00 |
| A.Sup3 | 2.74 | 1.05 | 3.00 | 1.50 |
| *Distraction* | | | | |
| A.Dis1 | 3.74 | 1.32 | 4.00 | 2.50 |
| A.Dis2 | 4.26 | 0.96 | 5.00 | 1.00 |
| A.Dis3 | 2.35 | 0.98 | 2.00 | 1.00 |
| **Velocity** | | | | |
| *Safety* | | | | |
| V.Saf1 | 3.70 | 1.02 | 4.00 | 1.00 |
| V.Saf2 | 3.83 | 0.94 | 4.00 | 1.50 |
| V.Saf3 | 2.78 | 1.24 | 2.00 | 2.00 |
| *Supervision* | | | | |
| V.Sup1 | 2.74 | 1.05 | 3.00 | 2.00 |
| V.Sup2 | 3.17 | 1.07 | 3.00 | 1.50 |
| V.Sup3 | 3.22 | 1.13 | 4.00 | 2.00 |
| *Distraction* | | | | |
| V.Dis1 | 3.83 | 0.89 | 4.00 | 1.50 |
| V.Dis2 | 3.61 | 0.66 | 4.00 | 1.00 |
| V.Dis3 | 2.87 | 0.81 | 4.00 | 2.00 |
| *Adequacy* | | | | |
| V.Ade1 | 3.74 | 1.39 | 4.00 | 2.00 |
| V.Ade2 | 3.48 | 1.16 | 3.00 | 1.00 |
| V.Ade3 | 3.43 | 1.12 | 4.00 | 1.00 |
| **Trajectory** | | | | |
| *Safety* | | | | |
| T.Saf1 | 2.74 | 1.14 | 3.00 | 2.00 |
| T.Saf2 | 3.74 | 0.86 | 4.00 | 1.00 |
| T.Saf3 | 3.74 | 1.01 | 4.00 | 0.50 |
| *Supervision* | | | | |
| T.Sup1 | 2.65 | 1.23 | 2.00 | 1.50 |
| T.Sup2 | 3.87 | 0.76 | 4.00 | 0.50 |
| T.Sup3 | 3.74 | 1.01 | 4.00 | 1.00 |
| *Distraction* | | | | |
| T.Dis1 | 3.30 | 1.11 | 3.00 | 1.50 |
| T.Dis2 | 2.87 | 0.76 | 4.00 | 0.50 |
| T.Dis3 | 2.87 | 0.87 | 3.00 | 1.00 |
| *Adequacy* | | | | |
| T.Ade1 | 2.74 | 1.29 | 3.00 | 2.00 |
| T.Ade2 | 2.83 | 0.94 | 3.00 | 2.00 |
| T.Ade3 | 3.74 | 0.92 | 4.00 | 1.00 |
| | Younger Adults | Older Adults |
|---|---|---|
| **Altitude** | | |
| Preference | High altitude | High altitude |
| Safety | High altitude safer than medium and low altitudes | No significant difference |
| Supervision | No significant difference | No significant difference |
| Distraction | Medium and low altitudes distract more than high altitude | Medium and low altitudes distract more than high altitude |
| **Velocity** | | |
| Preference | High velocity | High velocity |
| Safety | Lower velocity safer than high velocity | Lower and medium velocities safer than high velocity |
| Supervision | Low velocity has more perceived supervision than high velocity | No significant difference |
| Distraction | No significant difference | Lower velocity distracts more than high velocity |
| Adequacy | No significant difference | No significant difference |
| **Trajectory** | | |
| Preference | Circular trajectory | Circular trajectory |
| Safety | Circular and elliptical trajectory farther from the user's face safer than elliptical trajectory closer to the user's face | Circular and elliptical trajectory farther from the user's face safer than elliptical trajectory closer to the user's face |
| Supervision | Elliptical trajectory close to the user has more perceived supervision than circular and elliptical trajectory far from the user | Higher supervision for elliptical trajectory close to the user than circular and elliptical trajectory far from the user |
| Distraction | Elliptical trajectory close to the user distracts more than circular and elliptical trajectory farther from the user's face | No significant difference |
| Adequacy | Elliptical trajectory far from the user and circular trajectory are more adequate than elliptical trajectory closer to the user's face | Elliptical trajectory far from the user more adequate than circular and elliptical trajectory closer to the user's face |
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Belmonte, L.M.; García, A.S.; Morales, R.; de la Vara, J.L.; López de la Rosa, F.; Fernández-Caballero, A. Feeling of Safety and Comfort towards a Socially Assistive Unmanned Aerial Vehicle That Monitors People in a Virtual Home. Sensors 2021, 21, 908. https://doi.org/10.3390/s21030908