A Comparison of Head Movement Classification Methods
Figure 1. Paradigm schematic and example of data recorded via the headset. (**A**) An example of the virtual environment experienced by participants during the search task, along with examples of the high and low spatial frequency targets used. (**B**) Potential motion trajectories of target disks. Disks could move smoothly (Pursuit condition) or instantaneously appear at a new location (Instantaneous condition). The dotted circles represent potential locations at which the disk could appear or to which it could move; they were not visible in the actual experiment. At the bottom of the panel is an example of the high and low spatial frequency Gabor patches used. (**C**) A graphic depicting how data are collected and analyzed. Unity tracks head position and rotation from the center of the headset using a left-handed coordinate system in which the Y-axis is up/down, the X-axis is left/right, and the Z-axis is backwards/forwards. Head rotation data are plotted for each axis during an example trial: a Pursuit trial in which the target moved smoothly in the 225° direction to an eccentricity of 50° (see panel B). Snapshots illustrate how these data translate into head rotation at three time points throughout the trial. From the rotation data around each axis, the angular distance over time can be calculated, yielding an angular head speed (plotted in purple on the right y-axis of the line graph).
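Panel C describes how per-axis rotation samples are converted into an angular head speed trace. Below is a minimal sketch of that conversion, assuming rotations arrive as (x, y, z) Euler angles in degrees at a fixed sampling rate (`euler_deg` and `fs` are illustrative names, not the authors’ code):

```python
# Minimal sketch (not the authors' code): convert per-axis head rotations
# sampled by Unity into an angular head speed trace, as in Figure 1C.
# `euler_deg` is an (N, 3) array of (x, y, z) rotations in degrees and
# `fs` is the sampling rate in Hz -- both names are assumptions.
import numpy as np
from scipy.spatial.transform import Rotation

def angular_head_speed(euler_deg: np.ndarray, fs: float) -> np.ndarray:
    """Angular distance between successive samples, scaled to deg/s."""
    rots = Rotation.from_euler("xyz", euler_deg, degrees=True)
    # Relative rotation from each sample to the next; its magnitude is the
    # total angular distance regardless of which axes contributed. This
    # magnitude is unaffected by Unity's left-handed convention as long as
    # every sample is interpreted with the same convention.
    rel = rots[:-1].inv() * rots[1:]
    return np.degrees(rel.magnitude()) * fs  # degrees per second
```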
Figure 2. The GUI developed to enable human raters to code head movements. Raters were presented with head angular speed and magnitude over time. Trial information was conveyed through colored markers displayed at the bottom of the graph (green diamonds for fixation onset, purple triangles for target presentation and response, and red diamonds for the end of the trial). Raters clicked the times at which they judged a head movement to have started and finished; the GUI highlights these windows with green rectangles. The red crosses indicate the positions where the rater clicked to mark head movement onsets and offsets.
Figure 3. Examples of head angle magnitude over time in Instant (**left**) and Pursuit (**right**) trials. The head angle plotted is the angular difference between the head’s forward vector and the world’s forward vector. Time windows in which head movements were classified by each method are highlighted in red, with onsets and offsets marked by vertical dotted lines for comparison.
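The head angle described in this caption can be computed as the angle between two unit vectors. A minimal sketch, assuming `forward` is an (N, 3) array of unit head-forward vectors in world space (the name is an assumption):

```python
# Minimal sketch: head angle as plotted in Figure 3, i.e., the angle between
# the head's forward vector and the world's forward vector.
import numpy as np

def head_angle_deg(forward: np.ndarray) -> np.ndarray:
    world_forward = np.array([0.0, 0.0, 1.0])  # +Z is forward in Unity
    # Dot product of unit vectors gives the cosine of the angle between them;
    # clip guards against floating-point values slightly outside [-1, 1].
    cos_angle = np.clip(forward @ world_forward, -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle))
```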
Figure 4. Peak amplitude by peak velocity (also referred to as main sequence plots) of head movements classified by each algorithm for Instant and Pursuit trials.
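A main sequence plot of this kind can be assembled from any classifier’s output. The sketch below assumes each detected movement is available as a 1-D angular-speed trace in deg/s (`movements` and `fs` are hypothetical names):

```python
# Minimal sketch: build a main sequence plot (cf. Figure 4) from classified
# head movements. `movements` is a list of 1-D angular-speed traces (deg/s),
# one per movement, and `fs` is the sampling rate in Hz -- assumed names.
import matplotlib.pyplot as plt

def main_sequence(movements, fs):
    # Integrating speed over the movement window gives its amplitude in deg.
    amplitudes = [trace.sum() / fs for trace in movements]
    peak_vels = [trace.max() for trace in movements]
    plt.scatter(amplitudes, peak_vels, s=10)
    plt.xlabel("Peak amplitude (deg)")
    plt.ylabel("Peak velocity (deg/s)")
    plt.show()
```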
Figure 5. Starting (**top row**) and ending (**bottom row**) head unit vector rotations for every head movement in a trial. Head movements are color-coded by the direction in which the disk moved in that trial.
Abstract
1. Introduction
2. Materials and Methods
2.1. Ethics Statement
2.2. Participants
2.3. Apparatus
2.4. Calibration
2.5. Procedure
2.6. Head Movement Classification Methods
2.6.1. Human Raters
2.6.2. Baseline Movement Adaptive Threshold (BMAT)
2.6.3. Smoothed Velocity Threshold (SVT)
2.6.4. Chen and Walton
2.6.5. Differential IMU-like Zero Crossing Observation (DIZCO)
2.6.6. Decision Tree
2.7. Metrics of Algorithm Comparison
2.7.1. Cohen’s Kappa
2.7.2. Other Agreement Metrics
3. Results
4. Discussion
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Sidenmark, L.; Gellersen, H. Eye, Head and Torso Coordination during Gaze Shifts in Virtual Reality. ACM Trans. Comput.-Hum. Interact. 2019, 27, 1–40. [Google Scholar] [CrossRef]
- Andersson, R.; Larsson, L.; Holmqvist, K.; Stridh, M.; Nyström, M. One algorithm to rule them all? An evaluation and discussion of ten eye movement event-detection algorithms. Behav. Res. Methods 2017, 49, 616–637. [Google Scholar] [CrossRef]
- Hwang, A.D.; Wang, H.-C.; Pomplun, M. Semantic guidance of eye movements in real-world scenes. Vis. Res. 2011, 51, 1192–1205. [Google Scholar] [CrossRef] [PubMed]
- Unema, P.J.A.; Pannasch, S.; Joos, M.; Velichkovsky, B.M. Time course of information processing during scene perception: The relationship between saccade amplitude and fixation duration. Vis. Cogn. 2005, 12, 473–494. [Google Scholar] [CrossRef]
- Stuyven, E.; Van der Goten, K.; Vandierendonck, A.; Claeys, K.; Crevits, L. The effect of cognitive load on saccadic eye movements. Acta Psychol. 2000, 104, 69–85. [Google Scholar] [CrossRef]
- Ito, A.; Pickering, M.J.; Corley, M. Investigating the time-course of phonological prediction in native and non-native speakers of English: A visual world eye-tracking study. J. Mem. Lang. 2018, 98, 1–11. [Google Scholar] [CrossRef]
- Cecala, A.L.; Freedman, E.G. Amplitude changes in response to target displacements during human eye–head movements. Vis. Res. 2008, 48, 149–166. [Google Scholar] [CrossRef] [PubMed]
- Fetter, M. Vestibulo-Ocular Reflex. In Developments in Ophthalmology; Karger: Basel, Switzerland, 2007. [Google Scholar] [CrossRef]
- Solman, G.J.F.; Foulsham, T.; Kingstone, A. Eye and head movements are complementary in visual selection. R. Soc. Open Sci. 2017, 4, 160569. [Google Scholar] [CrossRef]
- David, E.; Beitner, J.; Võ, M.L.-H. Effects of Transient Loss of Vision on Head and Eye Movements during Visual Search in a Virtual Environment. Brain Sci. 2020, 10, 841. [Google Scholar] [CrossRef]
- Stahl, J.S. Adaptive plasticity of head movement propensity. Exp. Brain Res. 2001, 139, 201–208. [Google Scholar] [CrossRef]
- Hardiess, G.; Gillner, S.; Mallot, H.A. Head and eye movements and the role of memory limitations in a visual search paradigm. J. Vis. 2008, 8, 7. [Google Scholar] [CrossRef] [PubMed]
- Lee, C. Eye and head coordination in reading: Roles of head movement and cognitive control. Vis. Res. 1999, 39, 3761–3768. [Google Scholar] [CrossRef] [PubMed]
- Dar, A.H.; Wagner, A.S.; Hanke, M. REMoDNaV: Robust eye-movement classification for dynamic stimulation. Behav. Res. Methods 2021, 53, 399–414. [Google Scholar] [CrossRef]
- Swan, G.; Goldstein, R.B.; Savage, S.W.; Zhang, L.; Ahmadi, A.; Bowers, A.R. Automatic processing of gaze movements to quantify gaze scanning behaviors in a driving simulator. Behav. Res. Methods 2021, 53, 487–506. [Google Scholar] [CrossRef] [PubMed]
- Munn, S.M.; Stefano, L.; Pelz, J.B. Fixation-identification in dynamic scenes: Comparing an automated algorithm to manual coding. In Proceedings of the 5th Symposium on Applied Perception in Graphics and Visualization (APGV ’08), Los Angeles, CA, USA, 9–10 August 2008; Association for Computing Machinery: New York, NY, USA, 2008; pp. 33–42. [Google Scholar]
- Kothe, C.; Medine, D.; Boulay, C.; Grivich, M.; Stenner, T. Lab Streaming Layer. 2014. Available online: https://github.com/sccn/labstreaminglayer (accessed on 15 February 2024).
- Engbert, R.; Kliegl, R. Microsaccades uncover the orientation of covert attention. Vis. Res. 2003, 43, 1035–1045. [Google Scholar] [CrossRef]
- Nyström, M.; Holmqvist, K. An adaptive algorithm for fixation, saccade, and glissade detection in eyetracking data. Behav. Res. Methods 2010, 42, 188–204. [Google Scholar] [CrossRef]
- Chen, L.L.; Walton, M.M.G. Head Movement Evoked By Electrical Stimulation in the Supplementary Eye Field of the Rhesus Monkey. J. Neurophysiol. 2005, 94, 4502–4519. [Google Scholar] [CrossRef]
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830. [Google Scholar]
- Breiman, L. Classification and Regression Trees; Routledge: New York, NY, USA, 2017. [Google Scholar] [CrossRef]
- Panetta, K.; Wan, Q.; Rajeev, S.; Kaszowska, A.; Gardony, A.L.; Naranjo, K.; Taylor, H.A.; Agaian, S. ISeeColor: Method for Advanced Visual Analytics of Eye Tracking Data. IEEE Access 2020, 8, 52278–52287. [Google Scholar] [CrossRef]
- Agtzidis, I.; Startsev, M.; Dorr, M. Two hours in Hollywood: A manually annotated ground truth data set of eye movements during movie clip watching. J. Eye Mov. Res. 2020, 13, 1–12. [Google Scholar] [CrossRef]
- Cohen, J. A Coefficient of Agreement for Nominal Scales. Educ. Psychol. Meas. 1960, 20, 37–46. [Google Scholar] [CrossRef]
- Minnen, D.; Westeyn, T.; Starner, T.; Ward, J.A.; Lukowicz, P. Performance metrics and evaluation issues for continuous activity recognition. Perform. Metr. Intell. Syst. 2006, 4, 141–148. [Google Scholar]
- McHugh, M.L. Interrater reliability: The kappa statistic. Biochem. Medica 2012, 22, 276–282. [Google Scholar] [CrossRef]
- Bahill, A.; Clark, M.R.; Stark, L. The main sequence, a tool for studying human eye movements. Math. Biosci. 1975, 24, 191–204. [Google Scholar] [CrossRef]
- Hayhoe, M.; Ballard, D. Eye movements in natural behavior. Trends Cogn. Sci. 2005, 9, 188–194. [Google Scholar] [CrossRef] [PubMed]
- Kowler, E. Eye movements: The past 25 years. Vis. Res. 2011, 51, 1457–1483. [Google Scholar] [CrossRef]
- Startsev, M.; Agtzidis, I.; Dorr, M. Characterizing and automatically detecting smooth pursuit in a large-scale ground-truth data set of dynamic natural scenes. J. Vis. 2019, 19, 10. [Google Scholar] [CrossRef]
| | | BMAT | SVT | Chen and Walton | DIZCO | Decision Tree |
| --- | --- | --- | --- | --- | --- | --- |
| Instant Trials | Kappa | 0.63 (8.2 × 10⁻³) | 0.64 (6.8 × 10⁻³) | 0.65 (6.8 × 10⁻³) | 0.34 (6.2 × 10⁻³) | 0.48 (0.11) |
| | Threshold | 1.64 (0.03) | 18.41 (1.56) | 5.62 (0.07) | N/A | N/A |
| Pursuit Trials | Kappa | 0.61 (3.6 × 10⁻³) | 0.71 (4.0 × 10⁻³) | 0.62 (3.3 × 10⁻³) | 0.39 (0.02) | 0.59 (0.06) |
| | Threshold | 1.05 (6.6 × 10⁻³) | 9.57 (0.21) | 3.49 (0.07) | N/A | N/A |
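The kappa values in the table above quantify sample-by-sample agreement between each algorithm and the human raters. A toy illustration of the metric using scikit-learn, assuming binary per-sample labels where 1 marks a sample inside a head movement (the vectors are made-up data, not study results):

```python
# Toy illustration of the agreement metric tabled above: Cohen's kappa over
# per-sample binary labels (1 = sample lies inside a head movement).
import numpy as np
from sklearn.metrics import cohen_kappa_score

rater = np.array([0, 0, 1, 1, 1, 1, 0, 0, 1, 1, 0, 0])
algo  = np.array([0, 1, 1, 1, 1, 0, 0, 0, 1, 1, 1, 0])
# Kappa corrects raw agreement for chance: 1 = perfect, 0 = chance-level.
print(cohen_kappa_score(rater, algo))
```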
| | | Onset Bias | Offset Bias | Duration Bias | Amplitude Bias |
| --- | --- | --- | --- | --- | --- |
| Instant Trials | BMAT | −2.89 (9.97) | 21.01 (18.27) | 23.90 (23.91) | −2.41 (1.88) |
| | SVT | 11.04 (23.26) | 22.95 (39.92) | 11.90 (62.76) | 7.65 (5.93) |
| | Chen and Walton | −2.72 (9.46) | 16.78 (14.26) | 19.50 (19.07) | −3.16 (1.62) |
| | DIZCO | 20.40 (8.48) | 23.43 (15.02) | 3.03 (20.12) | 8.57 (3.53) |
| | Decision Tree | −11.80 (16.53) | 27.07 (50.10) | 38.88 (65.58) | 0.13 (3.87) |
| Pursuit Trials | BMAT | −1.75 (11.25) | 27.41 (17.43) | 29.17 (22.66) | −2.53 (1.29) |
| | SVT | 76.49 (12.10) | −2.62 (20.53) | −79.11 (23.18) | −0.77 (1.87) |
| | Chen and Walton | 7.73 (13.23) | 17.49 (17.30) | 9.76 (23.43) | −3.47 (1.22) |
| | DIZCO | 17.02 (9.61) | 47.89 (13.27) | 30.87 (18.17) | 7.87 (2.09) |
| | Decision Tree | −39.78 (13.62) | 61.92 (44.06) | 101.70 (54.81) | 4.85 (3.01) |
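A minimal sketch of how onset, offset, duration, and amplitude biases of this kind could be computed for events matched between a rater and an algorithm; positive onset or offset values mean the algorithm places the boundary later than the rater. This is not the authors’ implementation, and all names are hypothetical:

```python
# Minimal sketch: per-event biases between a human rater and an algorithm.
# `rater_ev` and `algo_ev` are (N, 2) arrays of matched [onset, offset] times
# (assumed milliseconds); `rater_amp`/`algo_amp` are amplitudes in degrees.
import numpy as np

def event_biases(rater_ev, algo_ev, rater_amp, algo_amp):
    onset = algo_ev[:, 0] - rater_ev[:, 0]    # > 0: algorithm starts late
    offset = algo_ev[:, 1] - rater_ev[:, 1]   # > 0: algorithm ends late
    duration = offset - onset                 # algo duration minus rater duration
    amplitude = np.asarray(algo_amp) - np.asarray(rater_amp)
    return {name: vals.mean() for name, vals in
            {"onset": onset, "offset": offset,
             "duration": duration, "amplitude": amplitude}.items()}
```

Note that this decomposition is consistent with the table: each duration bias equals the offset bias minus the onset bias (e.g., BMAT Instant: 21.01 − (−2.89) = 23.90).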