AMiCUS—A Head Motion-Based Interface for Control of an Assistive Robot
Figure 1. Kinematics of the cervical spine. From a kinematic point of view, the human cervical spine can be approximated by a ball joint: every motion can be decomposed into single rotations around three orthogonal axes that intersect in one point. This point, the center of rotation, roughly coincides with the location of the thyroid gland. As a result, a rigid body placed onto a human head moves on a spherical surface during head motion.
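The spherical-surface claim follows directly from rigid-body rotation about a fixed point: any composition of rotations about the center preserves the distance of a head-mounted sensor from that center. A minimal numerical check in Python (the coordinate convention and the sensor offset are our arbitrary choices, not values from the paper):

```python
import numpy as np

def rot_zyx(phi, theta, psi):
    """Z-Y-X rotation from yaw phi, pitch theta, roll psi (radians)."""
    cz, sz = np.cos(phi), np.sin(phi)
    cy, sy = np.cos(theta), np.sin(theta)
    cx, sx = np.cos(psi), np.sin(psi)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

center = np.array([0.0, 0.0, 0.0])     # ball-joint center of rotation
sensor = np.array([0.00, 0.08, 0.12])  # sensor position on the head (m), hypothetical
for angles in [(0.3, -0.2, 0.1), (-0.5, 0.4, 0.0)]:
    p = center + rot_zyx(*angles) @ (sensor - center)
    print(np.linalg.norm(p - center))  # radius is invariant: motion on a sphere
```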
Figure 2. Coordinate systems of the AMiCUS system. Degrees of freedom (DOFs) of the same color are controlled by the same head DOF. The zero orientation of the head coordinate system depends on whether the cursor or the robot is controlled. The head coordinate system is denoted by $^{h_r}\boldsymbol{\alpha} = {}^{h_r}(\varphi, \vartheta, \psi)^T$ during robot control and by $^{h_c}\boldsymbol{\alpha} = {}^{h_c}(\varphi, \vartheta, \psi)^T$ during cursor control.
Figure 3. Mapping of head DOFs onto robot DOFs. Four groups are depicted: Gripper, Orientation, Vertical Plane and Horizontal Plane. The user is able to switch between groups in order to control all DOFs of the robot.
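The group concept maps the three measurable head DOFs onto subsets of the robot's Cartesian DOFs plus the gripper. A sketch of such a mapping table in Python; the group names follow the caption, but the concrete axis assignments are illustrative assumptions, not the paper's exact mapping:

```python
# Group names follow Figure 3; the axis assignments below are illustrative only.
GROUP_MAP = {
    "Gripper":          {"theta": "grip_open_close"},
    "Orientation":      {"phi": "rot_z", "theta": "rot_y", "psi": "rot_x"},
    "Vertical Plane":   {"phi": "trans_x", "theta": "trans_z"},
    "Horizontal Plane": {"phi": "trans_x", "theta": "trans_y"},
}

def robot_command(group: str, head: dict) -> dict:
    """Map the measured head angles onto the DOFs of the active group."""
    return {dof: head[axis] for axis, dof in GROUP_MAP[group].items()}

print(robot_command("Vertical Plane", {"phi": 0.2, "theta": -0.1, "psi": 0.0}))
```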
Figure 4. Robot control transfer function. A Gompertz function with parameters $A_{\max} = 1$, $\delta = -30$ and $r = -10$ is used as the transfer function. The space between the dashed lines indicates the deadzone.
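In its standard three-parameter form, which matches the caption's parameter names, the Gompertz function is $f(x) = A_{\max}\, e^{\delta e^{r x}}$. With the given parameters it stays near zero for small deflections (realizing the deadzone) and saturates at $A_{\max}$ for large ones. A short sketch, assuming the head angle is normalized to $[0, 1]$ and the output is a normalized robot speed:

```python
import numpy as np

A_MAX, DELTA, R = 1.0, -30.0, -10.0  # parameters from the caption

def transfer(x):
    """Normalized head deflection x in [0, 1] -> normalized robot speed."""
    return A_MAX * np.exp(DELTA * np.exp(R * x))

# Small deflections map to practically zero speed (the deadzone);
# large deflections saturate smoothly at A_MAX.
for x in (0.0, 0.1, 0.3, 0.5, 1.0):
    print(f"x = {x:.1f} -> v = {transfer(x):.4f}")
```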
Figure 5. Graphical User Interface during Robot Control Mode. The GUI displays an icon of the current robot group (top right), the image of the gripper camera (bottom right), an information line (bottom) and feedback about the current head angle, given by the arrow (left). The square represents the deadzone, in which the robot arm cannot be moved.
Figure 6. Head Gesture. The gesture is displayed as its $^{h_r}\vartheta$-angle over time $t$. Parameters: $d_{\max}$ = amplitude, $w$ = peak width, $t_c$ = location of the peak center.
Figure 7. Graphical User Interface during Cursor Control Mode. The GUI contains one Slide Button for each robot group. The dwell buttons in the top toolbar allow the user to perform several actions, such as pausing control, starting calibration routines or exiting the program.
Figure 8. Slide Button. The following steps are necessary for successful activation: when the button is in its neutral state ($S_0$), the mouse cursor has to dwell in the button ($S_1$) until visual and acoustic feedback occurs. Then the button has to be moved to the right along the rail ($S_2$). At the end of the rail, visual and acoustic feedback is given. Next, the button has to be moved back to the left along the rail ($S_3$). When the button reaches the initial position ($S_4$), it is activated and the assigned action is performed.
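The activation sequence $S_0 \to S_4$ is a small linear state machine. A sketch in Python; the event names and the reset-on-deviation behavior are our assumptions, not spelled out in the caption:

```python
from enum import Enum, auto

class Slide(Enum):
    S0_NEUTRAL = auto()  # button at rest
    S1_DWELL = auto()    # cursor has dwelled in the button; feedback given
    S2_RIGHT = auto()    # button slid to the right end of the rail; feedback given
    S3_LEFT = auto()     # button on its way back to the left
    S4_ACTIVE = auto()   # initial position reached: assigned action fires

# Linear progression; any other event drops back to neutral.
TRANSITIONS = {
    (Slide.S0_NEUTRAL, "dwell_complete"): Slide.S1_DWELL,
    (Slide.S1_DWELL, "reached_rail_end"): Slide.S2_RIGHT,
    (Slide.S2_RIGHT, "moving_left"): Slide.S3_LEFT,
    (Slide.S3_LEFT, "reached_start"): Slide.S4_ACTIVE,
}

def step(state: Slide, event: str) -> Slide:
    return TRANSITIONS.get((state, event), Slide.S0_NEUTRAL)
```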
Figure 9. Experimental setup of the predefined task. The subjects were clearly instructed how to move the robot. Movements 1–3 had to be performed in the Vertical Plane group, movements 4–6 in the Horizontal Plane group. After movement 6, the subjects had to perform one 90°-rotation around each rotation axis.
Figure 10. Experimental setup of the complex task. The users had to find their own control strategy to solve the task.
Figure 11. Completion rate of the complex task. There was no statistically significant difference between the users with full and restricted ROM. The overall completion rate of the complex task was 72.2% (n = 18).
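One standard way to test for such a group difference in completion rates is Fisher's exact test on a 2×2 table. A sketch with hypothetical per-group counts, chosen only so that the pooled rate matches 13/18 ≈ 72.2%; the actual per-group split is not given here:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = ROM group, columns = (completed, failed).
# Counts are invented so that 13 of 18 subjects complete the task overall.
full_rom       = [9, 3]
restricted_rom = [4, 2]

odds_ratio, p_value = fisher_exact([full_rom, restricted_rom])
print(f"p = {p_value:.3f}")  # a large p-value gives no evidence of a group difference
```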
Figure 12. Comparison between Slide Button and Head Gesture. The performance of the Slide Button and the Head Gesture was evaluated in terms of success rate and activation time.
Figure 13. Control speed evaluation. Mean and standard deviation of subjective velocities during gripping, linear motion and rotations (n = 18).
Abstract
1. Introduction
- The HRI should be adaptive, always using the full available neck range of motion of the user.
- The relationship between performed head motion and resulting robot motion has to be intuitive.
- The HRI must reliably distinguish between unintended head motion, head motion intended for direct control and head motion to generate switching commands.
- The HRI has to give sufficient and useful feedback to the user to allow safe and efficient operation.
- The HRI must enable the user to perform smooth, precise and efficient robot movements in Cartesian space.
- The user should enjoy using the HRI.
2. AMiCUS
2.1. Relation to Previously Published Work
2.2. Sensor Placement
2.3. Hardware
2.4. Robot Groups
2.5. Robot Control Mode
Head Gesture
A head gesture was accepted only if all of the following conditions were met:
- The neck was flexed sufficiently.
- The gesture was performed quickly enough.
- The gesture was sufficiently Gaussian-shaped.
- The maximum $\varphi$- and $\psi$-angles did not exceed 80% of $d_{\max}$ (see the sketch below).
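A minimal sketch of how these four criteria might be checked against a Gaussian model $\vartheta(t) = d_{\max} \exp(-(t - t_c)^2 / (2w^2))$ of the pitch trace. All threshold values, the goodness-of-fit measure, and the reading of the 80% bound as relative to $d_{\max}$ are hypothetical stand-ins, not the paper's values:

```python
import numpy as np

def gaussian(t, d_max, t_c, w):
    """Gaussian nod model: amplitude d_max, peak center t_c, peak width w."""
    return d_max * np.exp(-((t - t_c) ** 2) / (2 * w ** 2))

def is_valid_gesture(t, pitch, roll, yaw, d_max, t_c, w,
                     min_amplitude=0.3,    # rad, hypothetical
                     max_width=0.5,        # s, hypothetical
                     min_r2=0.9,           # fit quality, hypothetical
                     off_axis_ratio=0.8):  # 80 % bound from the text
    model = gaussian(t, d_max, t_c, w)
    r2 = 1.0 - np.sum((pitch - model) ** 2) / np.sum((pitch - pitch.mean()) ** 2)
    return (d_max >= min_amplitude                         # neck flexed sufficiently
            and w <= max_width                             # performed quickly enough
            and r2 >= min_r2                               # sufficiently Gaussian-shaped
            and np.max(np.abs(roll)) <= off_axis_ratio * d_max
            and np.max(np.abs(yaw)) <= off_axis_ratio * d_max)
```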
2.6. Cursor Control Mode
Slide Button
2.7. Calibration Routines
2.7.1. Robot Calibration
2.7.2. Cursor Calibration
3. Materials and Methods
3.1. Subjects
3.2. Experimental Setup
3.3. Procedure
3.3.1. Trial Session
3.3.2. Predefined Task
3.3.3. Complex Task
3.4. Evaluation Criteria
3.4.1. Objective Evaluation
3.4.2. Subjective Evaluation
4. Results and Discussion
4.1. Objective Evaluation
4.1.1. Completion Rates
4.1.2. Success Rates and Activation Times
4.2. Subjective Evaluation
4.2.1. Calibration
4.2.2. GUI
4.2.3. Switching
4.2.4. Mapping
4.2.5. Transfer Function
4.2.6. General
5. Conclusions
5.1. Compliance with Requirements for a Head Motion-Based HRI
5.2. Comparison with Previously Published Work
Supplementary Materials
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| AHRS | Attitude and Heading Reference System |
| AMiCUS | Adaptive Head Motion Control for User-friendly Support |
| DOF | Degree of Freedom |
| GUI | Graphical User Interface |
| HMI | Human–Machine Interface |
| HRI | Human–Robot Interface |
| MEMS | Micro-Electro-Mechanical System |
| ROM | Range of Motion |
| TCP | Tool Center Point |
| Setting | Value | Meaning |
|---|---|---|
| Sample rate | 125 Hz | |
| Operating Mode | 4 | Full Motion on |
| Packet Select | 8 | Motion Engine output |
| Format Select | 0 | Format 0 packet |
| FF2 | true | Enable output of linear acceleration, no gravity |
| FF6 | true | Enable output of angular position |
| Topic | No. | Statement | Rating |
|---|---|---|---|
| Calibration | 1 | Understanding what one is supposed to do during Cursor Calibration is easy | |
| | 2 | Understanding what one is supposed to do during Robot Calibration is easy | |
| | 3 | The acoustic feedback during calibration is useful | |
| GUI | 4 | The Cursor GUI is visually appealing and clearly structured | |
| | 5 | In Cursor Mode, all important functions can be accessed easily | |
| | 6 | The Robot GUI is visually appealing and clearly structured | |
| | 7 | In Robot Mode, all important information is displayed on the GUI | |
| | 8 | The feedback about the current head position is easy to understand and useful | |
| | 9 | The camera image is useful | |
| Switching | 10 | Activating the Slide Button is easy | |
| | 11 | The acoustic and visual feedback for the Slide Button is useful | |
| | 12 | Performing the Head Gesture correctly is easy | |
| | 13 | The feedback about Head Gesture execution is useful | |
| | 14 | Switching between groups is easy | |
| | 15 | Switching between groups is quick | |
| Mapping | 16 | I can well imagine what the gripper does when I move my head up or down | |
| | 17 | I can well imagine what the robot does in Vertical Plane group when I move my head up or down | |
| | 18 | I can well imagine what the robot does in Vertical Plane or Horizontal Plane group when I turn my head to the left or to the right | |
| | 19 | I can well imagine what the robot does in Horizontal Plane group when I move my head up or down | |
| | 20 | I can well imagine what the robot does in Orientation group when I move my head up or down | |
| | 21 | I can well imagine what the robot does in Orientation group when I turn my head to the left or to the right | |
| | 22 | I can well imagine what the robot does in Orientation group when I bend my head to the left or to the right | |
| Transfer function | 23 | Moving the mouse cursor precisely is easy | |
| | 24 | Gripping precisely is easy | |
| | 25 | Moving the robot arm precisely is easy | |
| General | 26 | Assessing robot position correctly in Vertical Plane group is easy | |
| | 27 | Assessing robot position correctly in Horizontal Plane group is easy | |
| | 28 | Assessing robot orientation correctly is easy | |
| | 29 | I can easily keep an eye on all relevant parts of the system | |
| | 30 | Robot control is fun | |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).