Systematic Review

Usability Assessments for Augmented Reality Head-Mounted Displays in Open Surgery and Interventional Procedures: A Systematic Review

1 US FDA Center for Devices and Radiological Health, Silver Spring, MD 20903, USA
2 GE HealthCare, Chicago, IL 60661, USA
3 Johnson & Johnson MedTech, Cincinnati, OH 45242, USA
* Author to whom correspondence should be addressed.
The previous address is the same as Affiliation 1.
Multimodal Technol. Interact. 2023, 7(5), 49; https://doi.org/10.3390/mti7050049
Submission received: 20 February 2023 / Revised: 24 March 2023 / Accepted: 11 April 2023 / Published: 9 May 2023
(This article belongs to the Special Issue Virtual Reality and Augmented Reality)

Abstract

Augmented reality (AR) head-mounted displays (HMDs) are an increasingly popular technology. For surgical applications, the use of AR HMDs to display medical images or models may reduce invasiveness and improve task performance by enhancing understanding of the underlying anatomy. This technology may be particularly beneficial in open surgeries and interventional procedures for which the use of endoscopes, microscopes, or other visualization tools is insufficient or infeasible. While the capabilities of AR HMDs are promising, their usability for surgery is not well-defined. This review identifies current trends in the literature, including device types, surgical specialties, and reporting of user demographics, and provides a description of usability assessments of AR HMDs for open surgeries and interventional procedures. Assessments applied to other extended reality technologies are included to identify additional usability assessments for consideration when assessing AR HMDs. The PubMed, Web of Science, and EMBASE databases were searched through September 2022 for relevant articles that described user studies. User assessments most often addressed task performance. However, objective measurements of cognitive, visual, and physical loads, known to affect task performance and the occurrence of adverse events, were limited. There was also incomplete reporting of user demographics. This review reveals knowledge and methodology gaps for usability of AR HMDs and demonstrates the potential impact of future usability research.

1. Introduction

Augmented reality (AR) describes the display of virtual objects integrated with the real environment. AR has been applied across various fields, including medicine, with surgical applications among the most common medical applications [1]. While interest in AR for surgical applications began as early as the 1980s [2,3,4], the development of AR systems for surgical planning and procedures has increased in recent years, driven by technical advances and proliferation of commercially available head-mounted displays (HMD) [5,6].
HMDs provide a hands-free view of virtual 2D and 3D text and images anchored to the physician’s field of view, to a specific location in the surgical suite, to objects in the environment, or to anatomical landmarks in or near the surgical site, with minimal visual occlusion of the surroundings. With these capabilities, the use of AR HMDs for surgery is purported to enhance diagnosis and surgical planning through interaction with virtual, patient-specific models [7]; facilitate intraoperative navigation by displaying medical images on or near the surgical site [8,9,10]; improve ergonomics by displaying one or more datasets in easily viewable locations [10,11,12]; and increase attention to patient vital signs and alarms compared to conventional visual displays [13,14].
While the purported benefits of AR HMD use could be widely applicable across surgical approaches, their use in open surgery and interventional procedures demands special attention. In particular, the use of AR HMDs presents an opportunity to enhance the visualization available for procedures that are not amenable to the use of endoscopes, microscopes, or other advanced visual aids. Further, maintaining an egocentric view of the surgical scene is a key feature of the technology, and it is not yet fully exploited in scope-mediated procedures. As such, the usability of AR HMDs for open surgery and interventional procedures is the focus of this work. Given the distinct visual, cognitive, and physical demands of scope-mediated procedures (e.g., dependence on a camera for visualization of the patient’s anatomy, flattening of 3D anatomy visualization to a 2D video stream, and complex transformation of hand movement to tool movement), the application of AR technologies for these procedures is not reviewed here, although it has been studied elsewhere [15,16,17,18].
Incorporating AR HMDs into conventional surgical and interventional applications is a non-trivial pursuit, requiring careful studies of the physical, perceptual, and cognitive considerations that accompany the use of AR HMDs [19,20]. The potential use of AR HMDs in such high-stress and high-stakes applications as surgery necessitates a thorough understanding of their usability (i.e., the users, use cases, and user–device interactions) to ensure safety and effectiveness. To lay the groundwork for future usability research and device evaluations for AR HMDs in open surgery and interventional procedures, a comprehensive description of published usability assessments and reporting for AR HMDs and related technologies is presented. As the use and evaluation of AR HMD technologies are still relatively new and the corresponding literature may be limited, the authors included usability assessments for devices that share a subset of technical aspects or device use characteristics with AR HMDs. This broadened search included devices that display virtual information (i.e., anatomical models, markings, or annotations) beyond the standard use of monitors to display 2D medical information; devices that display any information superimposed onto the user’s field of view, whether in the main display area or an inset display; devices that project content onto the patient or the environment; and devices that utilize the principles of stereoscopy to create the illusion of depth in 2D content. The included devices expand beyond the spectrum of partial to fully virtual visualizations known as extended reality (XR); they will be referred to here as XR+. The inclusion of devices related to AR HMDs was intended to capture usability assessments that are applicable to the evaluation of AR HMDs but have not yet been applied to them.
Several published reviews have addressed the usability of AR HMDs in surgical applications. A review published in 2004 showed the use of AR in a variety of surgical specialties, with an emphasis on HMDs [as well as heads-up displays (HUDs)] as emerging technology [21]. The most prominent medical specialties in which AR had been applied at that time were neurosurgery, otolaryngology, and maxillofacial surgery. The author acknowledged the limitations of nascent AR systems, recognizing that they were relevant only for surgical procedures with low demands on surgical dexterity. A literature review of HMDs (as well as HUDs) in surgical applications through 2017 enumerated the specific display devices used, how they were used, and the medical specialties in which they were applied [22]. Special attention was given to use with live human patients. The most common applications of AR in surgery included visualization from surgical microscopes, navigation, monitoring of vital signs, and display of preoperative images. While this review described the broad scope of HMD devices and use cases, an equally thorough treatment of usability assessments has not been presented and would be beneficial for the design of future user studies.
AR usability has been reviewed broadly across diverse applications, including medicine, education, navigation and driving, and tourism and exploration [1]. In that review, which covered articles from 2005 to 2014, medicine was the third largest category of published literature, behind perception and interaction. The most common uses of AR in medicine at that time were laparoscopy, exposure therapy for phobia treatment, and physical rehabilitation. Common usability assessments included subjective ratings, error/accuracy measures, and task completion time. In the years since the publications included in that review, there has been a continued surge in AR applications and user studies in medicine, which may have been influenced by technological improvements and the availability of consumer AR systems. As a result, it is now possible to identify trends and analyze gaps in user study designs and assessments.
There have also been review articles specific to the use of AR and other XR devices in individual medical specialties, such as neurosurgery [23,24,25,26], orthopedic surgery [27] and plastic surgery [28], and for various laparoscopy and robotic surgery applications [15,16,17,18,29]. A recent review of optical see-through head-mounted displays in augmented-reality-assisted surgery noted the important role of human factors for the devices’ utility [30].
In this review, our objective was to identify trends in the literature related to the use of AR HMDs and other XR+ devices in open surgery and interventional procedures and the study of their usability. Specifically, we assessed (1) the growth of AR HMD and XR+ applications and the types of devices used for these applications, (2) their relative use in various medical specialties and in pre- vs. intraoperative applications, (3) the reporting of user demographics, and (4) methods for assessing the usability of AR HMDs and other XR+ devices. Usability methods were analyzed to determine which aspects of usability have been addressed, how they have been addressed, and which additional assessments may be informative in future evaluations. User studies for surgical planning and procedures were considered regardless of participant type, medical specialty, publication year, or number of citations. However, for the reasons stated above, scope-mediated approaches were excluded. Explicit comparisons across studies were not made.

2. Materials and Methods

A systematic literature review was conducted in accordance with the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) method (Figure 1). The PubMed, PubMed Central, EMBASE, and Web of Science databases were searched to find articles pertinent to AR and other XR+ devices in surgical applications. HUDs were referenced in the search terms as these displays are not always reported as XR in the literature and might otherwise be under-represented in our results. The following terms and Boolean logic, modified from the work of Dünser, Grasset [31] and Dey, Billinghurst [1], were applied to the titles, abstracts, and keywords of all available records through September 2022:
(“Augmented reality” OR “Mixed reality” OR “Virtual reality” OR “Augmented virtuality” OR “Stereoscop*” OR “Head$ up Display” OR “3D visualization”) AND (surg*) AND (“user evaluation$” OR “user stud*” OR “survey*” OR “interview*” OR “questionnaire$” OR “pilot stud*” OR “usability” OR “human factors” OR “user experience$” OR “ergonomic*” OR (“participant$” AND “study”) OR (“participant$” AND “studies”) OR (“participant$” AND “experiment$”) OR (“subject$” AND “study”) OR (“subject$” AND “studies”) OR (“subject$” AND “experiment$”)).
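As an illustration of how such a query might be submitted programmatically, the sketch below uses Biopython’s Entrez wrapper for the PubMed E-utilities. The condensed query, placeholder email address, and date bounds are assumptions, and wildcard/field syntax differs across PubMed, EMBASE, and Web of Science, so this is not a reproduction of the review’s exact search.

```python
# Minimal sketch: submitting a condensed form of the Boolean query to PubMed
# via Biopython's Entrez E-utilities wrapper. Illustrative only; each database
# searched in the review has its own field tags and wildcard rules.
from Bio import Entrez

Entrez.email = "reviewer@example.org"  # NCBI requires a contact address

xr = ('"augmented reality" OR "mixed reality" OR "virtual reality" '
      'OR "augmented virtuality" OR stereoscop* OR "heads up display" '
      'OR "3D visualization"')
usability = ('"user evaluation" OR "user study" OR survey* OR interview* '
             'OR questionnaire* OR "pilot study" OR usability OR '
             '"human factors" OR "user experience" OR ergonomic*')
query = f"({xr}) AND surg* AND ({usability})"

handle = Entrez.esearch(db="pubmed", term=query, retmax=5000,
                        datetype="pdat", mindate="1900/01/01",
                        maxdate="2022/09/30")
record = Entrez.read(handle)
print(record["Count"], "PubMed records matched")
```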
The initial search found 4525 records. After the removal of duplicates, 2748 records remained. Another 2533 records were excluded because they were not in English (54 records), did not describe the use of XR+ for surgery or interventional procedures (318), did not provide a complete description of an original research study (691), only focused on surgical training or education (1211), did not include augmentation or replacement of visual information (31), did not include a user study (28), or focused on a scope-mediated procedure (200).
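A quick arithmetic check of the screening counts reported above (the category labels paraphrase the exclusion criteria listed in the text):

```python
# Exclusion tallies from the title/abstract screening stage.
excluded = {
    "not in English": 54,
    "no XR+ use for surgery or interventional procedures": 318,
    "not a complete original research study": 691,
    "surgical training or education only": 1211,
    "no augmentation or replacement of visual information": 31,
    "no user study": 28,
    "scope-mediated procedure": 200,
}
assert sum(excluded.values()) == 2533
print(2748 - sum(excluded.values()))  # -> 215 records for full-text review
```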
The full texts of the remaining 215 records were then subjected to in-depth analysis by five of the authors (E.J.B., K.F., A.S.K., K.L.K., and H.L.B.). To start, 10 articles were analyzed by each author, and the results were compared and discussed to arrive at a consistent coding approach. Coding involved extraction of data pertaining to the display hardware used, XR+ visualization type, displayed information, medical specialty, task information, user demographics, usability assessments, and dependent variables. The remaining articles were then divided among the group, coded, and later reviewed by one author (E.J.B.) for consistency. The extracted information was organized to interpret the trends in (1) device type and number of studies per year, (2) distribution of applications by medical specialty and in preoperative vs. intraoperative settings, (3) user demographics related to experience level, sex, and age, and (4) usability and related assessments. Information related to medical specialty grouping and terminology was reviewed by a surgeon (B.B.). During this phase, an additional 68 articles were excluded for the above-mentioned criteria, leaving 147 articles for qualitative data synthesis (Supplemental Table S1). Fifty-three articles described assessments of AR HMD devices.

3. Results

3.1. Device Types

The 147 articles included for data synthesis were published from 1995 to September 2022 (Figure 2). Fifty-three articles described the use of an AR HMD, the first of which was published in 2004.
Several types of hardware have been used for XR+ visualization. Figure 3 shows the number and temporal distribution of publications for AR HMDs and the additional XR+ devices. Since 2017, AR HMDs have been the most prominent XR+ device type. Articles about AR HMDs and other XR+ devices increased until 2020; the reduced article count in 2021 may be attributable to the COVID-19 pandemic, which likely reduced both the ability to conduct in-person user studies and hospital research capacity. Because the end date for the search was September 2022, counts for the year 2022 do not represent a full year of articles.

3.2. Surgical Applications

XR+ surgical applications spanned twelve specialties, nine of which included the use of AR HMDs. The most featured specialties were orthopedic and spinal surgery (40 XR+ articles, 16 for AR HMDs), neurosurgery and interventional neuroradiology (26 and 8), and surgical and interventional oncology (23 and 9); see Table 1 for the full list. The breadth of medical specialties demonstrates the versatility and potential uses for AR HMDs and highlights underlying procedural commonalities that motivate their use for open and interventional procedures.
XR+ technologies were used in both pre- and intraoperative settings. In 58% of all analyzed articles (85) and 79% of AR HMD articles (42), the device was used to perform a real or simulated surgical procedure. XR+ devices were used for planning alone in 49 articles (8 with AR HMDs), although some of these applications involved rehearsal for a real surgery using a patient-specific simulation [89,91,154]. Thirteen articles described surgical planning and use of this planning technology during the procedure, three of them with AR HMDs. The distribution of all pre- and intraoperative applications over time (Figure 4a) suggests an emphasis on XR+ use during procedures, particularly for AR HMD use (Figure 4b).

3.3. User Demographics

One hundred and nine (109) XR+ articles (83%) specified the total number of subjects in the user studies. The mean number of subjects was 13.6, with a median of 10 and a range from 1 to 77 subjects. Only 24 articles (18%) specified the number of female users, precluding further analysis. For AR HMDs, 35 articles (80%) specified the total number of subjects. The mean and median sample size were 12.9 and 10, with a range of 1 to 62. Eleven (11) articles reported the number of female users (25%).
Across all XR+ articles and the subset of AR HMD articles, user demographic data based on surgical experience revealed a lack of complete reporting, particularly for studies that included expert users (Table 2). For this review, novices were defined as subjects with no reported medical education, trainees were subjects currently engaged in medical training (e.g., medical students, residents, and fellows), and experts were subjects who were reported as attending or expert surgeons or proceduralists. One hundred and four (104) XR+ articles included expert users, but only 15 of these articles (12%) included statistics on the number of female subjects, while 14 articles (14%) included the age of the users. For articles that included novices, 41% reported the number of female users (13), and 52% reported ages of subjects (16). The reporting percentages for articles that included trainees were higher than those for articles including experts (35% for number of female users and 33% for age statistics).
For AR HMDs, 36 articles included expert users, of which 6 articles (17%) included statistics on the number of female subjects and 9 articles (25%) included the age of the users. In the 12 articles where novice subjects were included, 5 articles reported the number of female users (42%) and 7 reported age (58%). For articles that included trainees, reporting percentages for age and sex were on par with those for novice users (57% for both).

3.4. Usability Assessments

For each article, user assessments and dependent variables from the included user studies were summarized and categorized. These results are summarized in Table 3. The included assessments covered open questions of feasibility, hardware and system characterization, and human factors relating to physiological loads on the user. Seven categories of usability assessments and two additional related categories emerged during analysis, listed in descending order of use in XR+ articles. Usability assessments were:
  1. Task Performance (84)—Assessments of motor or visuomotor task success.
  2. User Experience (80)—Interviews, surveys, or other user-reported feedback about the usability and effectiveness of visualization type or hardware.
  3. Completion Times (55)—Duration of setup or task performance.
  4. Cognition (27)—Assessments of mental and attentional demands or changes in decision making.
  5. Visual Effects (22)—Objective assessments of visualization quality or accuracy, relative effectiveness among visual augmentations or rendering options, visual perception, or adverse physiological events related to visual perception.
  6. Efficiency (17)—Quantification of intraoperative imaging use, material use, or tool path length.
  7. Physical Loads (1)—Quantification of muscle activity or body movement.
Additional assessments related to usability were:
  8. System Performance (39)—Measurements of visualization hardware and software accuracy and speed.
  9. Validity/Reliability (12)—Comparison of simulators to real situations, or comparisons within and across users or observers.

4. Discussion

4.1. Usability Assessments for XR+ Devices

4.1.1. Task Performance

Task performance was most often assessed using a measurement of error from a landmark in the surgical space or from the surgical plan. Other common metrics included the number of task completions within a time frame, the rate of successful task completions across trials, performance ratings based on standardized evaluation tools or observation by experts, general feasibility of device use, or remarks regarding post-surgical patient outcomes. Interestingly, Scherl, et al. [159] used an AR HMD during a live human surgery and noted that a reversible complication that occurred in 3.7% of procedures without the device did not occur during its use. The Objective Structured Assessment of Technical Skills (OSATS), a validated global rating scale developed to assess surgical skills of trainees [174,175], was also used.

4.1.2. XR+ User Experience

Consistent with this review’s focus on usability assessments, more than half of the reviewed XR+ articles described the use of user experience (UX) questionnaires to gather feedback on the effectiveness, usefulness, or comfort of various XR+ devices. The System Usability Scale (SUS) is a widely used questionnaire that was featured in several papers [176]. The vast majority of UX questionnaires and usability assessments captured the experiences of single users working alone. In contrast, Willaert, Aggarwal [91] captured single-subject and group dynamics by using the SUS along with the Non-Technical Skills (NOTECHS) rating scale for surgeons [177] and the Mayo High Performance Teamwork Scale (MHPTS) [178]; whole surgical teams evaluated virtual surgical simulations for patient-specific rehearsal. The application of these or similar assessments for communication and teamwork during multi-user AR HMD-assisted surgeries would provide valuable insight into the safety and effectiveness of AR HMD use in surgical applications. Indeed, real surgeries are almost exclusively performed in teams, and failures in teamwork are a strong predictor of surgical errors [179]. While UX and user preferences are often captured through subjective measures, Harake, Gnanappa [120] endeavored to link survey data to objective measures. To quantify user preferences between 2D and 3D presentations of echocardiograms, the investigators measured time spent viewing each image type and rotations of the images as a metric of engagement. Beyond capturing user preferences, this type of analysis could be used to understand how users interact with the visualized data and which aspects of the visualized data or user interface are of most value or draw the most attention.
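For reference, the SUS mentioned above follows a fixed scoring recipe; a minimal sketch of the standard Brooke scoring (not tied to any specific reviewed study):

```python
# Standard SUS scoring: odd-numbered items contribute (response - 1), even-
# numbered items contribute (5 - response); the sum is scaled by 2.5 to give
# a 0-100 score. Responses are 1-5 Likert ratings in questionnaire order.
def sus_score(responses: list[int]) -> float:
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    odd = sum(r - 1 for r in responses[0::2])   # items 1, 3, 5, 7, 9
    even = sum(5 - r for r in responses[1::2])  # items 2, 4, 6, 8, 10
    return 2.5 * (odd + even)

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```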

4.1.3. Completion Times

Completion time metrics included times for system setup, system calibration, co-registration of virtual information to the real environment, and pre- and intraoperative task performance. One of the unique metrics in this category was the time needed for a team to discuss and reach consensus on congenital heart disease diagnoses [121]. This analysis answers the overarching question of potential time reduction but also targets behavioral and cognitive aspects of the clinical process that might be enhanced by novel visualization technologies.

4.1.4. Cognition

To address issues of cognition, some articles measured cognitive load, spatial cognition, attention, and decision making. Cognitive load was assessed through use of the NASA-TLX [6,123] and a surgery-focused variant, the Surg-TLX [50]. The NASA-TLX [180,181] remains a popular assessment tool despite questions regarding the construct validity [182] and interpretability [183,184] of the results. Objective measures may prove favorable alternatives, such as metrics derived from physiological signals [e.g., cortical activity via electroencephalography (EEG) or functional near-infrared spectroscopy (fNIRS) or physiological stress via galvanic skin response (GSR) or heart rate variability (HRV)].
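As an example of such an objective measure, the sketch below computes RMSSD, a common time-domain HRV metric. The RR intervals are synthetic, and the use of RMSSD as a load proxy is an assumption for illustration, not a method reported in the reviewed studies.

```python
import numpy as np

def rmssd(rr_ms: np.ndarray) -> float:
    """Root mean square of successive RR-interval differences (ms)."""
    diffs = np.diff(rr_ms)  # successive inter-beat differences
    return float(np.sqrt(np.mean(diffs ** 2)))

rr = np.array([812.0, 790.0, 805.0, 830.0, 798.0, 776.0, 810.0])  # synthetic
print(f"RMSSD = {rmssd(rr):.1f} ms")
```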
To assess attentional demands, Andersen, Popescu [143] recorded gaze shifts between the novel XR+ visualization and the available standard monitor. Where XR+ devices are purported to increase attention by improving the location of displayed data or providing multiple data streams in the user’s field of view, analyses of gaze shift may be used to assess this claim. Another paper compared the display viewing times for subjects who received task instructions on an AR HMD or standard display terminal [153]. The results present some complexity in the assessment of attention, as subjects completed tasks faster with the monitors but spent a higher percentage of time viewing them and shifted attention from the task to the display more frequently. Furthermore, analyses of attentional focus could benefit from additional information on “blindness” or inattention to quantify the extent to which a novel visualization method is distracting or requires such excessive cognitive demand that overall situational awareness is reduced [185,186]. Spatial cognitive demand may be dependent upon the user’s spatial reasoning abilities. The Mental Rotation Test [187], used in Pahuta, Schemitsch [64] and Sadri, Kohen [151], or a similar assessment may help determine which users will benefit most from these technologies.
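The gaze-shift analyses described above reduce to counting transitions in a labeled gaze stream; a minimal sketch, where the AOI labels and the upstream mapping from eye-tracker coordinates to AOIs are assumptions:

```python
# Count attention shifts between areas of interest (AOIs), e.g. the surgical
# task, a standard monitor, and the HMD overlay. Labeling raw gaze samples
# with AOIs is assumed to happen upstream.
def count_shifts(aoi_stream: list[str]) -> int:
    return sum(1 for a, b in zip(aoi_stream, aoi_stream[1:]) if a != b)

stream = ["task", "task", "monitor", "monitor", "task", "hmd", "task"]
print(count_shifts(stream))  # -> 4 attention shifts
```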
Another dimension of cognition considered in some studies was the impact of visualization on surgical decision making. Kendoff, Citak [61] showed that intraoperative use of a monitor-based AR technology revealed technical errors not visible with standard visualization and prompted immediate revision in 46 of 248 joint reconstruction surgeries. Others describe changes to the surgical intervention or strategy during planning [7,86,88,92,154]. This evidence speaks to potential increases in the safety and effectiveness of surgical interventions through advanced visualization technology.

4.1.5. Visual Effects

Several articles evaluated visual effects, including depth perception, color rendering and perception, visibility of anatomical landmarks, text readability, and grayscale contrast perception. For example, the work of Hansen, Wieferich [145] dealt with rendering techniques to improve understanding of relative depth of blood vessels in the liver. These considerations may be particularly important across applications dealing with vasculature, nerves, or abundant overlapping structures. In Sadri, Kohen [151], the Stereo Fly Test (Stereo Optical Co., Inc., IL, USA) and the Pseudo-Isochromatic Plate (PIP) color vision test [188] were utilized as pre-tests for depth and color perception, respectively. This baseline information was useful to identify any confounding factors affecting the user’s ability to understand and manipulate color-coded 3D virtual heart models displayed via AR HMD. Qian, Barthel [6] demonstrated text readability and contrast perception assessments adapted for AR HMD use. These types of analyses are of particular importance given the transparency of the display, which allows ambient lighting conditions to impact the appearance of virtual content. Contrast perception of grayscale images is also relevant for the display of medical images, such as intraoperative X-ray fluoroscopy used for the orthopedic surgery application presented in the article. Despite the significance of contrast perception and text readability for speed and accuracy of data interpretation, this article presents the only objective, quantitative analysis of these characteristics among the articles included in the present review. Other aspects of image quality and accuracy were addressed by Southworth, Silva [115], such as geometric and rendering distortion and the dynamic range of color and grayscale test patterns under darkened and bright ambient conditions. Additional studies are needed to assess the risks of HMD alignment with the eyes, such as errors in the adjustment of interpupillary distance or subpar placement of the HMD on the head, and consequences for visual perception. Published and anecdotal data suggest that small alignment errors can create significant errors in perceived location of virtual data [189,190].
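One way to quantify the grayscale-contrast effects discussed above is Michelson contrast; a minimal sketch with illustrative luminance values (actual values would come from photometer measurements of the display under each ambient condition):

```python
def michelson_contrast(l_max: float, l_min: float) -> float:
    """Michelson contrast of a test pattern from its extreme luminances."""
    return (l_max - l_min) / (l_max + l_min)

# Ambient light leaking through an optical see-through display adds a common
# luminance offset, compressing the achievable contrast.
print(round(michelson_contrast(120.0, 2.0), 2))    # darkened room -> 0.97
print(round(michelson_contrast(420.0, 302.0), 2))  # bright room   -> 0.16
```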
Prolonged visual loading and certain visual stimuli can cause eye fatigue, dizziness, cybersickness, or seizure. Some articles mentioned the potential for these events or gathered feedback on the occurrence of symptoms, but no objective measures of physiological predictors or symptoms were recorded. Potential indicators include blinking, pupil diameter, HRV, gastric activity, GSR, and other physiological stress signals. Additionally, system performance metrics that relate to adverse visual effects could be quantified and assessed for safety. Low frame rates, visual lag, and flicker can cause cybersickness [191,192,193] as well as errors in task performance [194], and flicker may also cause seizures in susceptible individuals [191].
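System performance metrics of this kind are straightforward to derive from frame-time logs; a minimal sketch, where the 60 Hz render budget and the spike threshold are assumptions and safety-relevant limits are device- and task-specific:

```python
import numpy as np

frame_times_ms = np.array([16.6, 16.7, 16.8, 33.5, 16.6, 50.2, 16.7])
budget_ms = 1000.0 / 60.0                    # assumed 60 Hz render target
fps = 1000.0 / frame_times_ms.mean()
long_frames = int(np.sum(frame_times_ms > 1.5 * budget_ms))  # lag spikes
print(f"mean FPS = {fps:.1f}, frames over budget = {long_frames}")
```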

4.1.6. Efficiency

Efficiency was quantified via quality of movement (e.g., path length when using tools) and minimal use of additional imaging or materials (e.g., X-ray acquisitions or contrast volume used). The enhanced spatial cognition provided by XR+ technologies may reduce the patient’s and surgeon’s exposure to radiation, improving the safety of procedures requiring intraoperative imaging.
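Tool path length, the movement-quality metric mentioned above, is simply the summed distance between successive tracked tip positions; a minimal sketch with synthetic 3D samples:

```python
import numpy as np

tip_mm = np.array([[0, 0, 0], [2, 1, 0], [4, 1, 1], [5, 3, 1]], dtype=float)
path_length = np.sum(np.linalg.norm(np.diff(tip_mm, axis=0), axis=1))
print(f"tool path length = {path_length:.1f} mm")  # -> 6.7 mm
```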

4.1.7. Physical Loads

Measures of physical load, such as electromyography (EMG) and motion capture data, allow for objective, quantitative analysis and comparison of ergonomics across visualization types by pinpointing altered muscle activity, movements, or postures that could result in chronic pain or injury. Only one article quantified the physical loading or biomechanics of XR+ device use: Zuo, Jiang [152] collected EMG data to detect fatigue of the sternocleidomastoid muscle, a rotator and flexor of the neck, during AR HMD use.
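A standard EMG fatigue indicator is a downward drift of the power spectrum’s median frequency over time; the sketch below illustrates that analysis, not the pipeline of Zuo, Jiang [152], and the 1 kHz sampling rate and synthetic signal are assumptions.

```python
import numpy as np
from scipy.signal import welch

def median_frequency(emg: np.ndarray, fs: float = 1000.0) -> float:
    """Frequency splitting the EMG power spectrum into equal halves."""
    f, pxx = welch(emg, fs=fs, nperseg=256)
    cum = np.cumsum(pxx)
    return float(f[np.searchsorted(cum, cum[-1] / 2.0)])

rng = np.random.default_rng(0)
emg = rng.standard_normal(4096)  # stand-in for a recorded EMG epoch
print(f"median frequency = {median_frequency(emg):.1f} Hz")
```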

4.2. Additional Assessments and Reporting Related to Usability

4.2.1. User Demographics

A few articles provided age and sex data, but most articles did not. This incomplete reporting is in stark contrast to the detailed information often provided about the patients in these studies and is particularly troublesome given that trainees as well as experts may be the intended users for these devices. This lack of information limits understanding of the usability results and their generalizability across user groups.
The age and sex of users can influence the safety and effectiveness of XR+ device use. Age-related presbyopia, a reduced ability to focus on near objects, causes blurred near vision. Hardening of the eye lens during natural aging hinders accommodation, the mechanism of adjusting the eye lens to focus between distant and near images [195,196]. This is especially important to consider for near-eye displays like HMDs [196], for the tasks being performed (surgical tasks are often done within arm’s length) [197], and for the intended users (surgeons are often middle-aged or older) [198].
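For context, accommodation demand is the reciprocal of viewing distance in meters; a minimal worked example, where the 2 m display focal plane is an assumption typical of commercial optical see-through HMDs:

```python
surgical_site_m = 0.5   # arm's-length working distance
display_plane_m = 2.0   # assumed HMD focal plane
demand_site = 1.0 / surgical_site_m      # 2.0 diopters to focus on the site
demand_display = 1.0 / display_plane_m   # 0.5 diopters for virtual content
print(demand_site - demand_display)      # -> 1.5 D focus change per glance
```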
Sex also plays a role in XR+ device usability. Various studies have shown that women experience cybersickness more often than men [199,200,201]. Some studies have also shown that females tend to have lower visuospatial reasoning ability [202,203,204], although other studies have shown no differences [205,206,207,208] or differences that diminish with training [209]. In any case, users with lower spatial reasoning skills may benefit most from the improved spatial understanding that AR can provide.

4.2.2. System Performance

System performance was most often characterized by frame rate, system display lag, co-registration error, and calibration error. A major challenge of AR is the integration of virtual imagery with real objects and environments. Accurate image-to-patient co-registration is needed for surgical navigation. Several papers described novel co-registration methods, such as the use of externally affixed fiducial markers [87], manual image placement through hand gestures [79], automatic co-registration based on selected virtual points [8], or tracking of 3D surface features [137]. After initial co-registration, intraoperative physiological movements and shifting of tissue require continuous monitoring and corrections to ensure accuracy throughout the procedure. Physiological movement, such as breathing and heartbeat, can cause rhythmic co-registration errors [210,211,212]. User movement profiles also introduce varying levels of registration error [152]. Compensation for these movements may require advanced algorithms for motion correction. Another source of co-registration error is intraoperative tissue shifting due to handling or removal of tissue. Intraoperative imaging has been used to maintain co-registration during these shifts: ultrasound methods have been demonstrated for brain shift in neuro-oncology [78], orthopedic surgery at the pelvis [33], and pedicle screw placement [51]. El-Hariri, Pandey [33] reported RMS co-registration errors ranging from 3.22 to 28.30 mm, which they deemed promising but insufficient for surgical applications. These results suggest the need for studies of image co-registration to establish error thresholds for intraoperative use.
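The RMS co-registration error reported by El-Hariri, Pandey [33] is the root-mean-square of per-landmark distances between overlaid and reference positions; a minimal sketch with synthetic point sets:

```python
import numpy as np

def rms_error(virtual_pts: np.ndarray, reference_pts: np.ndarray) -> float:
    d = np.linalg.norm(virtual_pts - reference_pts, axis=1)  # per-landmark
    return float(np.sqrt(np.mean(d ** 2)))

ref = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
overlay = ref + np.array([[1.0, 0.5, 0.0], [0.8, -0.2, 0.3], [1.2, 0.1, -0.4]])
print(f"RMS co-registration error = {rms_error(overlay, ref):.2f} mm")
```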
Another aspect of system performance considered for AR HMDs was the effectiveness of the user interface under varying conditions. Kubben and Sinlae [72] evaluated the effects of various lighting conditions and colors of the surgeon’s gloves on hand gesture recognition for manipulation of virtual objects. While the authors reported that there were no noticeable differences across conditions for the HoloLens, similar analyses may be informative for other commercially available or custom-built AR HMDs for surgical use.

4.2.3. Validity and Reliability

Metrics of validity and reliability included face validity, content validity, and various metrics of intra- and inter-rater or user variability. For example, Timonen, Iso-Mustajarvi [163] validated the accuracy of a virtual surgical simulation by measuring the distances between anatomical landmarks in a cadaver and in the virtual model and performing Bland–Altman analyses of similarity.
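A Bland–Altman analysis of the kind used by Timonen, Iso-Mustajarvi [163] reduces to a mean bias and 95% limits of agreement over paired measurements; a minimal sketch with synthetic landmark distances:

```python
import numpy as np

cadaver_mm = np.array([12.1, 8.4, 15.0, 22.3, 9.8])   # physical measurements
virtual_mm = np.array([12.4, 8.1, 15.5, 22.0, 10.1])  # virtual-model values

diff = virtual_mm - cadaver_mm
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)  # 95% limits of agreement
print(f"bias = {bias:.2f} mm, LoA = [{bias - half_width:.2f}, "
      f"{bias + half_width:.2f}] mm")
```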

4.3. Limitations

The authors acknowledge that the search terms do not include less popular terms, such as “extended reality” or the specific terms for various hardware types; “augmented reality” and “virtual reality” were expected to be present in any articles that also included synonymous or similar terms. The search terms were also broadened to include surgical applications that did not involve AR HMDs. Uses of AR HMDs are emphasized in the text for clarity.
To focus on articles that included user studies, the articles were required to mention “user(s),” “participant(s),” or other common usability research terms in the abstract, title, or keywords. This stipulation may have excluded otherwise eligible articles from consideration. Further, the term “surg*” was intended to find articles related to surgery; no synonyms were included in the search terms.
Scope-mediated procedures were excluded to focus on open surgery and interventional procedure application spaces. However, usability articles for AR in scope-mediated procedures do include methodologies that could be pertinent for AR HMD use; future work could catalogue these methodologies as well. Lastly, the articles included are all published works. Publication bias may have excluded usability assessment information that would be relevant to this review.
No effect sizes were estimated, a protocol was not archived, and this review was not registered.

5. Conclusions

This systematic literature review identified trends in the use of AR HMDs and related technologies for surgical applications, including the number of publications for AR HMDs and related devices, the distribution of AR HMD and XR+ publications across medical specialties in preoperative vs. intraoperative applications, the user demographics reported in these user studies, and how usability has been assessed. The results show a growth in the published literature over the past two decades for XR+ devices, with AR HMDs particularly gaining prominence within the past decade. Orthopedic applications were most common, followed by neurosurgery and oncology. Whereas XR+ devices have been growing in use in both pre- and intraoperative settings, AR HMDs have most commonly been featured in intraoperative use. We found a lack of objective usability assessments for physiological loading and underreporting of age and sex statistics for users. The perceptual, cognitive, and physical loads imposed by XR+ device use are key components of device usability that are often overlooked, as demonstrated by the lack of articles that address the relevant assessment categories. For this review, perceptual loading assessments were limited to vision and related adverse effects due to the lack of user studies for haptic and auditory augmentations in surgery. While it is known that heightened physiological loads can diminish task performance, cause use errors, and negatively affect the user’s health [213], only 29% of the XR+ articles directly assessed physiological loads.
The contributions of this work include a list of usability assessments, categories of usability considerations to address, and potential assessments to address common methodology gaps. These findings are intended to inform future usability research for AR surgical applications.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/mti7050049/s1, Table S1: Summary of XR+ Devices, Use Cases, and Assessments from Analyzed Articles. File S1: PRISMA_2020_checklist.

Author Contributions

Conceptualization, E.J.B. and H.L.B.; methodology, E.J.B., K.F., B.B., K.L.K. and H.L.B.; validation, E.J.B., K.F., A.S.K., K.L.K. and H.L.B.; formal analysis, E.J.B., K.F., A.S.K., K.L.K. and H.L.B.; resources, E.J.B., A.S.K., K.L.K. and H.L.B.; data curation, E.J.B.; writing—original draft preparation, E.J.B.; writing—review and editing, K.F., B.B., A.S.K., K.L.K. and H.L.B.; visualization, E.J.B.; supervision, H.L.B.; project administration, E.J.B., K.L.K. and H.L.B.; funding acquisition, H.L.B. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Research Participation Program at the Center for Devices and Radiological Health (CDRH) administered by the Oak Ridge Institute for Science and Education through an inter-agency agreement between the U.S. Department of Energy and the U.S. Food and Drug Administration (FDA). Funders and sponsors had no role in the review other than an FDA review to ensure the article did not misrepresent any FDA policy.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are available upon request.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Note

The mention of commercial products, their sources, or their use in connection with material reported herein is not to be construed as either an actual or implied endorsement of such products by the Department of Health and Human Services.

References

  1. Dey, A.; Billinghurst, M.; Lindeman, R.W.; Swan, J.E. A systematic review of 10 years of augmented reality usability studies: 2005 to 2014. Front. Robot. AI 2018, 5, 37. [Google Scholar] [CrossRef]
  2. Linte, C.A.; Davenport, K.P.; Cleary, K.; Peters, C.; Vosburgh, K.G.; Navab, N.; Jannin, P.; Peters, T.M.; Holmes III, D.R.; Robb, R.A. On mixed reality environments for minimally invasive therapy guidance: Systems architecture, successes and challenges in their implementation from laboratory to clinic. Comput. Med. Imaging Graph. 2013, 37, 83–97. [Google Scholar] [CrossRef] [PubMed]
  3. Friets, E.M.; Strohbehn, J.W.; Hatch, J.F.; Roberts, D.W. A frameless stereotaxic operating microscope for neurosurgery. IEEE Trans. Biomed. Eng. 1989, 36, 608–617. [Google Scholar] [CrossRef] [PubMed]
  4. Kelly, P.J.; Alker, G.J., Jr.; Goerss, S. Computer-assisted stereotactic laser microsurgery for the treatment of intracranial neoplasms. Neurosurgery 1982, 10, 324–331. [Google Scholar] [CrossRef] [PubMed]
  5. Kress, B.; Starner, T. A review of head-mounted displays (HMD) technologies and applications for consumer electronics. In Proceedings of the Conference on SPIE Defense, Security, and Sensing, Baltimore, MD, USA, 29 April–3 May 2013; p. 87200A. [Google Scholar]
  6. Qian, L.; Barthel, A.; Johnson, A.; Osgood, G.; Kazanzides, P.; Navab, N.; Fuerst, B. Comparison of optical see-through head-mounted displays for surgical interventions with object-anchored 2D-display. Int. J. Comput. Assist. Radiol. Surg. 2017, 12, 901–910. [Google Scholar] [CrossRef] [PubMed]
  7. Brun, H.; Bugge, R.A.B.; Suther, L.K.R.; Birkeland, S.; Kumar, R.; Pelanis, E.; Elle, O.J. Mixed reality holograms for heart surgery planning: First user experience in congenital heart disease. Eur. Heart J. Cardiovasc. Imaging 2018, 20, 883–888. [Google Scholar] [CrossRef]
  8. Si, W.X.; Liao, X.; Qian, Y.; Wang, Q. Mixed reality guided radiofrequency needle placement: A pilot study. IEEE Access 2018, 6, 31493–31502. [Google Scholar] [CrossRef]
  9. Deib, G.; Johnson, A.; Unberath, M.; Yu, K.; Andress, S.; Qian, L.; Osgood, G.; Navab, N.; Hui, F.; Gailloud, P. Image guided percutaneous spine procedures using an optical see-through head mounted display: Proof of concept and rationale. J. Neurointerv. Surg. 2018, 10, 1187–1191. [Google Scholar] [CrossRef]
  10. Incekara, F.; Smits, M.; Dirven, C.; Vincent, A. Clinical feasibility of a wearable mixed-reality device in neurosurgery. World Neurosurg. 2018, 118, e422–e427. [Google Scholar] [CrossRef]
  11. Koesveld, J.; Tetteroo, G.; Graaf, E. Use of head-mounted display in transanal endoscopic microsurgery. Surg. Endosc. 2003, 17, 943–946. [Google Scholar] [CrossRef]
  12. Qian, L.; Deguet, A.; Kazanzides, P. ARssist: Augmented reality on a head-mounted display for the first assistant in robotic surgery. Healthc. Technol. Lett. 2018, 5, 194–200. [Google Scholar] [CrossRef] [PubMed]
  13. Liebert, C.A.; Zayed, M.A.; Aalami, O.; Tran, J.; Lau, J.N. Novel use of Google Glass for procedural wireless vital sign monitoring. Surg. Innov. 2016, 23, 366–373. [Google Scholar] [CrossRef] [PubMed]
  14. Liu, D.; Jenkins, S.A.; Sanderson, P.M.; Fabian, P.; Russell, W.J. Monitoring with head-mounted displays in general anesthesia: A clinical evaluation in the operating room. Anesth. Analg. 2010, 110, 1032–1038. [Google Scholar] [CrossRef] [PubMed]
  15. Kumar, S.; Singhal, P.; Krovi, V.N. Computer-vision-based decision support in surgical robotics. IEEE Des. Test 2015, 32, 89–97. [Google Scholar] [CrossRef]
  16. Sakata, S.; Watson, M.O.; Grove, P.M.; Stevenson, A.R. The conflicting evidence of three-dimensional displays in laparoscopy: A review of systems old and new. Ann. Surg. 2016, 263, 234–239. [Google Scholar] [CrossRef]
  17. Bernhardt, S.; Nicolau, S.A.; Soler, L.; Doignon, C. The status of augmented reality in laparoscopic surgery as of 2016. Med. Image Anal. 2017, 37, 66–90. [Google Scholar] [CrossRef]
  18. Luo, X.; Mori, K.; Peters, T.M. Advanced Endoscopic Navigation: Surgical Big Data, Methodology, and Applications. Annu. Rev. Biomed. Eng. 2018, 20, 221–251. [Google Scholar] [CrossRef]
  19. Condino, S.; Carbone, M.; Piazza, R.; Ferrari, M.; Ferrari, V. Perceptual limits of optical see-through visors for augmented reality guidance of manual tasks. IEEE Trans. Biomed. Eng. 2019, 67, 411–419. [Google Scholar] [CrossRef]
  20. Beams, R.; Brown, E.; Cheng, W.C.; Joyner, J.S.; Kim, A.S.; Kontson, K.; Amiras, D.; Baeuerle, T.; Greenleaf, W.; Grossmann, R.J.; et al. Evaluation Challenges for the Application of Extended Reality Devices in Medicine. J. Digit. Imaging 2022, 35, 1409–1418. [Google Scholar] [CrossRef]
  21. Shuhaiber, J.H. Augmented reality in surgery. Arch. Surg. 2004, 139, 170–174. [Google Scholar] [CrossRef]
  22. Yoon, J.W.; Chen, R.E.; Kim, E.J.; Akinduro, O.O.; Kerezoudis, P.; Han, P.K.; Si, P.; Freeman, W.D.; Diaz, R.J.; Komotar, R.J.; et al. Augmented reality for the surgeon: Systematic review. Int. J. Med. Robot. Comput. Assist. Surg. 2018, 14, 13. [Google Scholar] [CrossRef] [PubMed]
  23. Meola, A.; Cutolo, F.; Carbone, M.; Cagnazzo, F.; Ferrari, M.; Ferrari, V. Augmented reality in neurosurgery: A systematic review. Neurosurg. Rev. 2017, 40, 537–548. [Google Scholar] [CrossRef] [PubMed]
  24. Neubauer, A.; Wolfsberger, S. Virtual endoscopy in neurosurgery: A review. Neurosurgery 2013, 72 (Suppl. 1), A97–A106. [Google Scholar] [CrossRef] [PubMed]
  25. de Ribaupierre, S.; Eagleson, S. Editorial: Challenges for the usability of AR and VR for clinical neurosurgical procedures. Healthc. Technol. Lett. 2017, 4, 151. [Google Scholar] [CrossRef]
  26. Guha, D.; Alotaibi, N.M.; Nguyen, N.; Gupta, S.; McFaul, C.; Yang, V.X. Augmented reality in neurosurgery: A review of current concepts and emerging applications. Can. J. Neurol. Sci. 2017, 44, 235–245. [Google Scholar] [CrossRef]
  27. Jud, L.; Fotouhi, J.; Andronic, O.; Aichmair, A.; Osgood, G.; Navab, N.; Farshad, M. Applicability of augmented reality in orthopedic surgery—A systematic review. BMC Musculoskelet. Disord. 2020, 21, 103. [Google Scholar] [CrossRef]
  28. Kim, Y.; Kim, H.; Kim, Y.O. Virtual reality and augmented reality in plastic surgery: A review. Arch. Plast. Surg. 2017, 44, 179. [Google Scholar] [CrossRef]
  29. Qian, L.; Wu, J.Y.; DiMaio, S.P.; Navab, N.; Kazanzides, P. A Review of Augmented Reality in Robotic-Assisted Surgery. IEEE Trans. Med. Robot. Bionics 2019, 2, 1. [Google Scholar] [CrossRef]
  30. Birlo, M.; Edwards, P.E.; Clarkson, M.; Stoyanov, D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med. Image Anal. 2022, 77, 102361. [Google Scholar] [CrossRef]
  31. Dünser, A.; Grasset, R.; Billinghurst, M. A survey of evaluation techniques used in augmented reality studies. In Proceedings of the International Conference on Computer Graphics and Interactive Techniques, Los Angeles, CA, USA, 11–15 August 2008. [Google Scholar]
  32. Cutolo, F.; Carli, S.; Parchi, P.D.; Canalini, L.; Ferrari, M.; Lisanti, M.; Ferrari, V. AR interaction paradigm for closed reduction of long-bone fractures via external fixation. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, Munich, Germany, 2–4 November 2016; Spencer, S.N., Ed.; Assoc Computing Machinery: New York, NY, USA, 2016; pp. 305–306. [Google Scholar]
  33. El-Hariri, H.; Pandey, P.; Hodgson, A.J.; Garbi, R. Augmented reality visualisation for orthopaedic surgical guidance with pre- and intra-operative multimodal image data fusion. Healthc. Technol. Lett. 2018, 5, 189–193. [Google Scholar] [CrossRef]
  34. Fotouhi, J.; Alexander, C.P.; Unberath, M.; Taylor, G.; Lee, S.C.; Fuerst, B.; Johnson, A.; Osgood, G.; Taylor, R.H.; Khanuja, H.; et al. Plan in 2-D, execute in 3-D: An augmented reality solution for cup placement in total hip arthroplasty. J. Med. Imaging 2018, 5, 021205. [Google Scholar] [CrossRef] [PubMed]
  35. Molina, C.A.; Theodore, N.; Ahmed, A.K.; Westbroek, E.M.; Mirovsky, Y.; Harel, R.; Khan, M.; Witham, T.; Sciubba, D.M. Augmented reality-assisted pedicle screw insertion: A cadaveric proof-of-concept study. J. Neurosurg. Spine 2019, 31, 139–146. [Google Scholar] [CrossRef] [PubMed]
  36. Wang, H.X.; Wang, F.; Leong, A.P.Y.; Xu, L.; Chen, X.; Wang, Q. Precision insertion of percutaneous sacroiliac screws using a novel augmented reality-based navigation system: A pilot study. Int. Orthop. 2016, 40, 1941–1947. [Google Scholar] [CrossRef]
  37. Cutolo, F.; Parchi, P.D.; Ferrari, V. Video see through AR head-mounted display for medical procedures. In Proceedings of the 2014 IEEE International Symposium on Mixed and Augmented Reality, Munich, Germany, 10–12 September 2014; Julier, S., Lindeman, R.W., Sandor, C., Eds.; IEEE: New York, NY, USA, 2014; pp. 393–396. [Google Scholar]
  38. Traub, J.; Stefan, P.; Heining, S.M.; Sielhorst, T.; Riquarts, C.; Euler, E.; Navab, N. Stereoscopic augmented reality navigation for trauma surgery: Cadaver experiment and usability study. Int. J. Comput. Assist. Radiol. Surg. 2006, 1, 30–32. [Google Scholar]
  39. Nguyen, N.Q.; Cardinell, J.; Ramjist, J.M.; Androutsos, D.; Yang, V.X. Augmented Reality and Human Factors Regarding the Neurosurgical Operating Room Workflow. In Proceedings of the Conference on Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality, Virtual, 28–31 March 2020; Kress, B.C., Peroz, C., Eds.; Spie-Int Soc Optical Engineering: Bellingham, WA, USA, 2020. [Google Scholar]
  40. Urakov, T.M. Augmented Reality-assisted Pedicle Instrumentation: Versatility Across Major Instrumentation Sets. Spine 2020, 45, E1622–E1626. [Google Scholar] [CrossRef] [PubMed]
  41. Viehofer, A.F.; Wirth, S.H.; Zimmermann, S.M.; Jaberg, L.; Dennler, C.; Fürnstahl, P.; Farshad, M. Augmented reality guided osteotomy in hallux Valgus correction. BMC Musculoskelet. Disord. 2020, 21, 1–6. [Google Scholar] [CrossRef]
  42. Bichlmeier, C.; Heining, S.M.; Feuerstein, M.; Navab, N. The virtual mirror: A new interaction paradigm for augmented reality environments. IEEE Trans. Med. Imaging 2009, 28, 1498–1510. [Google Scholar] [CrossRef]
  43. Gu, W.; Martin-Gomez, A.; Cho, S.M.; Osgood, G.; Bracke, B.; Josewski, C.; Knopf, J.; Unberath, M. The impact of visualization paradigms on the detectability of spatial misalignment in mixed reality surgical guidance. Int. J. Comput. Assist. Radiol. Surg. 2022, 17, 921–927. [Google Scholar] [CrossRef]
  44. Harel, R.; Anekstein, Y.; Raichel, M.; Molina, C.A.; Ruiz-Cardozo, M.A.; Orrú, E.; Khan, M.; Mirovsky, Y.; Smorgick, Y. The XVS System During Open Spinal Fixation Procedures in Patients Requiring Pedicle Screw Placement in the Lumbosacral Spine. World Neurosurg. 2022, 164, e1226–e1232. [Google Scholar] [CrossRef]
  45. Yanni, D.S.; Ozgur, B.M.; Louis, R.G.; Shekhtman, Y.; Iyer, R.R.; Boddapati, V.; Iyer, A.; Patel, P.D.; Jani, R.; Cummock, M.; et al. Real-time navigation guidance with intraoperative CT imaging for pedicle screw placement using an augmented reality head-mounted display: A proof-of-concept study. Neurosurg. Focus 2021, 51, E11. [Google Scholar] [CrossRef]
  46. Saylany, A.; Spadola, M.; Blue, R.; Sharma, N.; Ozturk, A.K.; Yoon, J.W. The Use of a Novel Heads-Up Display (HUD) to View Intra-Operative X-rays during a One-Level Cervical Arthroplasty. World Neurosurg. 2020, 138, 369–373. [Google Scholar] [CrossRef] [PubMed]
  47. Yoon, J.W.; Chen, R.E.; Han, P.K.; Si, P.; Freeman, W.D.; Pirris, S.M. Technical feasibility and safety of an intraoperative head-up display device during spine instrumentation. Int. J. Med. Robot. Comput. Assist. Surg. 2017, 13, e1770. [Google Scholar] [CrossRef] [PubMed]
  48. Alexander, C.; Loeb, A.E.; Fotouhi, J.; Navab, N.; Armand, M.; Khanuja, H.S. Augmented Reality for Acetabular Component Placement in Direct Anterior Total Hip Arthroplasty. J. Arthroplast. 2020, 35, 1636–1641.e3. [Google Scholar] [CrossRef]
  49. Bong, J.H.; Kim, H.; Park, S. Development of a surgical navigation system for corrective osteotomy based on augmented reality. Int. J. Precis. Eng. Manuf. 2017, 18, 1057–1062. [Google Scholar] [CrossRef]
  50. Fischer, M.; Fuerst, B.; Lee, S.C.; Fotouhi, J.; Habert, S.; Weidert, S.; Euler, E.; Osgood, G.; Navab, N. Preclinical usability study of multiple augmented reality concepts for K-wire placement. Int. J. Comput. Assist. Radiol. Surg. 2016, 11, 1007–1014. [Google Scholar] [CrossRef]
  51. Ma, L.F.; Zhao, Z.; Chen, F.; Zhang, B.; Fu, L.; Liao, H. Augmented reality surgical navigation with ultrasound-assisted registration for pedicle screw placement: A pilot study. Int. J. Comput. Assist. Radiol. Surg. 2017, 12, 2205–2215. [Google Scholar] [CrossRef]
  52. Ogawa, H.; Hasegawa, S.; Tsukada, S.; Matsubara, M. A pilot study of augmented reality technology applied to the acetabular cup placement during total hip arthroplasty. J. Arthroplast. 2018, 33, 1833–1837. [Google Scholar] [CrossRef]
  53. Ponce, B.A.; Brabston, E.W.; Zu, S.; Watson, S.L.; Baker, D.; Winn, D.; Guthrie, B.L.; Shenai, M.B. Telemedicine with mobile devices and augmented reality for early postoperative care. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Orlando, FL, USA, 16–20 August 2016; pp. 4411–4414. [Google Scholar]
  54. Tsukada, S.; Ogawa, H.; Nishino, M.; Kurosaka, K.; Hirasawa, N. Augmented reality-based navigation system applied to tibial bone resection in total knee arthroplasty. J. Exp. Orthop. 2019, 6, 44. [Google Scholar] [CrossRef]
  55. Wang, X.; Habert, S.; Zu Berge, C.S.; Fallavollita, P.; Navab, N. Inverse visualization concept for RGB-D augmented C-arms. Comput. Biol. Med. 2016, 77, 135–147. [Google Scholar] [CrossRef]
  56. Chen, X.; Naik, H.; Wang, L.; Navab, N.; Fallavollita, P. Video-guided calibration of an augmented reality mobile C-arm. Int. J. Comput. Assist. Radiol. Surg. 2014, 9, 987–996. [Google Scholar] [CrossRef]
  57. Erat, O.; Pauly, O.; Weidert, S.; Thaller, P.; Euler, E.; Mutschler, W.; Navab, N.; Fallavollita, P. How a surgeon becomes Superman by visualization of intelligently fused multi-modalities. In Proceedings of the Medical Imaging 2013: Image-Guided Procedures, Robotic Interventions, and Modeling, Orlando, FL, USA, 9–14 February 2013; Holmes, D.R., Yaniv, Z.R., Eds.; SPIE-Int Soc Optical Engineering: Bellingham, WA, USA, 2013. [Google Scholar]
  58. Gavaghan, K.; Oliveira-Santos, T.; Peterhans, M.; Reyes, M.; Kim, H.; Anderegg, S.; Weber, S. Evaluation of a portable image overlay projector for the visualisation of surgical navigation data: Phantom studies. Int. J. Comput. Assist. Radiol. Surg. 2012, 7, 547–556. [Google Scholar] [CrossRef] [PubMed]
  59. Heining, S.M.; Wiesner, S.; Euler, E.; Navab, N. Pedicle screw placement under video-augmented flouroscopic control: First clinical application in a cadaver study. Int. J. Comput. Assist. Radiol. Surg. 2006, 1, 189–190. [Google Scholar]
  60. Juhnke, B.; Berron, M.; Philip, A.; Williams, J.; Holub, J.; Winer, E. Comparing the Microsoft (R) Kinect (TM) to a traditional mouse for adjusting the viewed tissue densities of three-dimensional anatomical structures. In Proceedings of the Medical Imaging 2013: Image Perception, Observer Performance, and Technology Assessment, Lake Buena Vista, FL, USA, 28 March 2013; Abbey, C.K., MelloThoms, C.R., Eds.; SPIE-Int Soc Optical Engineering: Bellingham, WA, USA, 2013. [Google Scholar]
  61. Kendoff, D.; Citak, M.; Gardner, M.J.; Stübig, T.; Krettek, C.; Hüfner, T. Intraoperative 3D Imaging: Value and Consequences in 248 Cases. J. Trauma Inj. Infect. Crit. Care 2009, 66, 232–238. [Google Scholar] [CrossRef] [PubMed]
  62. Londei, R.; Esposito, M.; Diotte, B.; Weidert, S.; Euler, E.; Thaller, P.; Navab, N.; Fallavollita, P. Intra-operative augmented reality in distal locking. Int. J. Comput. Assist. Radiol. Surg. 2015, 10, 1395–1403. [Google Scholar] [CrossRef] [PubMed]
  63. Marschollek, M.; Teistler, M.; Bott, O.J.; Stuermer, K.M.; Pretschner, D.P.; Dresing, K. Pre-operative dynamic interactive exploration of complex articular fractures using a novel 3D navigation tool. Methods Inf. Med. 2006, 45, 384–388. [Google Scholar]
  64. Pahuta, M.A.; Schemitsch, E.H.; Backstein, D.; Papp, S.; Gofton, W. Virtual fracture carving improves understanding of a complex fracture: A randomized controlled study. J. Bone Jt. Surg. Am. Vol. 2012, 94, e182.1–e182.7. [Google Scholar] [CrossRef]
  65. Testi, D.; Lattanzi, R.; Benvegnù, M.; Petrone, M.; Zannoni, C.; Viceconti, M.; Toni, A. Efficacy of stereoscopic visualization and six degrees of freedom interaction in preoperative planning of total hip replacement. Med. Inform. Internet Med. 2006, 31, 205–218. [Google Scholar] [CrossRef]
  66. Vaghela, K.R.; Lee, J.; Akhtar, K. Performance on a virtual reality DHS simulator correlates with performance in the operating theatre. Surg. Technol. Int. 2018, 33, sti33/1040. [Google Scholar]
  67. Alsofy, S.Z.; Nakamura, M.; Ewelt, C.; Kafchitsas, K.; Lewitz, M.; Schipmann, S.; Molina, E.S.; Santacroce, A.; Stroop, R. Retrospective Comparison of Minimally Invasive and Open Monosegmental Lumbar Fusion, and Impact of Virtual Reality on Surgical Planning and Strategy. J. Neurol. Surg. Part A Cent. Eur. Neurosurg. 2021, 82, 399–409. [Google Scholar]
  68. Moreta-Martinez, R.; Pose-Díez-de-la-Lastra, A.; Calvo-Haro, J.A.; Mediavilla-Santos, L.; Pérez-Mañanes, R.; Pascau, J. Combining Augmented Reality and 3D Printing to Improve Surgical Workflows in Orthopedic Oncology: Smartphone Application and Clinical Evaluation. Sensors 2021, 21, 1370. [Google Scholar] [CrossRef]
  69. Pandey, P.U.; Guy, P.; Lefaivre, K.A.; Hodgson, A.J. What are the optimal targeting visualizations for performing surgical navigation of iliosacral screws? A user study. Trauma Surg. 2021, 143, 677–690. [Google Scholar] [CrossRef] [PubMed]
  70. Cartucho, J.; Shapira, D.; Ashrafian, H.; Giannarou, S. Multimodal mixed reality visualisation for intraoperative surgical guidance. Int. J. Comput. Assist. Radiol. Surg. 2020, 15, 819–826. [Google Scholar] [CrossRef] [PubMed]
  71. Cutolo, F.; Meola, A.; Carbone, M.; Sinceri, S.; Cagnazzo, F.; Denaro, E.; Esposito, N.; Ferrari, M.; Ferrari, V. A new head-mounted display-based augmented reality system in neurosurgical oncology: A study on phantom. Comput. Assist. Surg. 2017, 22, 39–53. [Google Scholar] [CrossRef] [PubMed]
  72. Kubben, P.; Sinlae, R.N. Feasibility of using a low-cost head-mounted augmented reality device in the operating room. Surg. Neurol. Int. 2019, 10, 26. [Google Scholar] [CrossRef]
  73. Fick, T.; van Doormaal, J.A.M.; Hoving, E.W.; Regli, L.; van Doormaal, T.P.C. Holographic patient tracking after bed movement for augmented reality neuronavigation using a head-mounted display. Acta Neurochir. 2021, 163, 879–884. [Google Scholar] [CrossRef] [PubMed]
  74. Benmahdjoub, M.; Niessen, W.J.; Wolvius, E.B.; Walsum, T.V. Multimodal markers for technology-independent integration of augmented reality devices and surgical navigation systems. Virtual Real. 2022, 26, 1637–1650. [Google Scholar] [CrossRef]
  75. Condino, S.; Montemurro, N.; Cattari, N.; D’Amato, R.; Thomale, U.; Ferrari, V.; Cutolo, F. Evaluation of a Wearable AR Platform for Guiding Complex Craniotomies in Neurosurgery. Ann. Biomed. Eng. 2021, 49, 2590–2605. [Google Scholar] [CrossRef]
  76. Thabit, A.; Benmahdjoub, M.; Van Veelen, M.L.C.; Niessen, W.J.; Wolvius, E.B.; van Walsum, T. Augmented reality navigation for minimally invasive craniosynostosis surgery: A phantom study. Int. J. Comput. Assist. Radiol. Surg. 2022, 17, 1453–1460. [Google Scholar] [CrossRef]
  77. Coelho, G.; Rabelo, N.N.; Vieira, E.; Mendes, K.; Zagatto, G.; de Oliveira, R.S.; Raposo-Amaral, C.E.; Yoshida, M.; de Souza, M.R.; Fagundes, C.F.; et al. Augmented reality and physical hybrid model simulation for preoperative planning of metopic craniosynostosis surgery. Neurosurg. Focus 2020, 48, E19. [Google Scholar] [CrossRef]
  78. Gerard, I.J.; Kersten-Oertel, M.; Drouin, S.; Hall, J.A.; Petrecca, K.; De Nigris, D.; Di Giovanni, D.A.; Arbel, T.; Collins, D.L. Combining intraoperative ultrasound brain shift correction and augmented reality visualizations: A pilot study of eight cases. J. Med. Imaging 2018, 5, 021210. [Google Scholar] [CrossRef]
  79. Léger, É.; Reyes, J.; Drouin, S.; Collins, D.L.; Popa, T.; Kersten-Oertel, M. Gesture-based registration correction using a mobile augmented reality image-guided neurosurgery system. Healthc. Technol. Lett. 2018, 5, 137–142. [Google Scholar] [CrossRef] [PubMed]
  80. Léger, É.; Reyes, J.; Drouin, S.; Popa, T.; Hall, J.A.; Collins, D.L.; Kersten-Oertel, M. MARIN: An open-source mobile augmented reality interactive neuronavigation system. Int. J. Comput. Assist. Radiol. Surg. 2020, 15, 1013–1021. [Google Scholar] [CrossRef] [PubMed]
  81. Acker, G.; Schlinkmann, N.; Piper, S.K.; Onken, J.; Vajkoczy, P.; Picht, T. Stereoscopic versus monoscopic viewing of aneurysms: Experience of a single institution with a novel stereoscopic viewing system. World Neurosurg. 2018, 119, E491–E501. [Google Scholar] [CrossRef] [PubMed]
  82. Azar, F.S.; Perrin, N.; Khamene, A.; Vogt, S.; Sauer, F. User performance analysis of different image-based navigation systems for needle placement procedures. In Proceedings of the Medical Imaging 2004: Visualization, Image-Guided Procedures, and Display, San Diego, CA, USA, 15–17 February 2004; Galloway, R.L., Ed.; SPIE-Int Soc Optical Engineering: Bellingham, WA, USA, 2004; pp. 110–121. [Google Scholar]
  83. Gökyar, A.; Bahadır, S.; Çokluk, C. Evaluation of projection-based augmented reality technique in cerebral catheter procedures. Ann. Clin. Anal. Med. 2020, 11, 630–633. [Google Scholar]
  84. Kersten-Oertel, M.; Chen, S.J.-S.; Collins, D.L. An Evaluation of Depth Enhancing Perceptual Cues for Vascular Volume Visualization in Neurosurgery. IEEE Trans. Vis. Comput. Graph. 2014, 20, 391–403. [Google Scholar] [CrossRef]
  85. Rolls, A.E.; Riga, C.V.; Bicknell, C.D.; Stoyanov, D.V.; Shah, C.V.; Van Herzeele, I.; Hamady, M.; Cheshire, N.J. A pilot study of video-motion analysis in endovascular surgery: Development of real-time discriminatory skill metrics. Eur. J. Vasc. Endovasc. Surg. 2013, 45, 509–515. [Google Scholar] [CrossRef]
  86. Stadie, A.T.; Kockro, R.A. Mono-stereo-autostereo: The evolution of 3-dimensional neurosurgical planning. Neurosurgery 2013, 72, A63–A77. [Google Scholar] [CrossRef]
  87. Tabrizi, L.B.; Mahvash, M. Augmented reality-guided neurosurgery: Accuracy and intraoperative application of an image projection technique. J. Neurosurg. 2015, 123, 206–211. [Google Scholar] [CrossRef]
  88. Tai, A.X.; Riga, C.V.; Bicknell, C.D.; Stoyanov, D.V.; Shah, C.V.; Van Herzeele, I.; Hamady, M.; Cheshire, N.J. The Benefits of Limited Orbitotomy on the Supraorbital Approach: An Anatomic and Morphometric Study in Virtual Reality. Neurosurgery 2020, 18, 542–550. [Google Scholar] [CrossRef]
  89. Willaert, W.; Aggarwal, R.; Bicknell, C.; Hamady, M.; Darzi, A.; Vermassen, F.; Cheshire, N.; European Virtual Reality Endovascular Research Team. Patient-specific simulation in carotid artery stenting. J. Vasc. Surg. 2010, 52, 1700–1705. [Google Scholar] [CrossRef]
  90. Willaert, W.I.M.; Aggarwal, R.; Daruwalla, F.; Van Herzeele, I.; Darzi, A.W.; Vermassen, F.E.; Cheshire, N.J.; European Virtual Reality Endovascular Research Team EVEResT. Simulated procedure rehearsal is more effective than a preoperative generic warm-up for endovascular procedures. Ann. Surg. 2012, 255, 1184–1189. [Google Scholar] [CrossRef] [PubMed]
  91. Willaert, W.I.M.; Aggarwal, R.; Van Herzeele, I.; Plessers, M.; Stroobant, N.; Nestel, D.; Cheshire, N.; Vermassen, F. Role of patient-specific virtual reality rehearsal in carotid artery stenting. Br. J. Surg. 2012, 99, 1304–1313. [Google Scholar] [CrossRef]
  92. Zawy Alsofy, S.; Sakellaropoulou, I.; Stroop, R. Evaluation of Surgical Approaches for Tumor Resection in the Deep Infratentorial Region and Impact of Virtual Reality Technique for the Surgical Planning and Strategy. J. Craniofac. Surg. 2020, 31, 1865–1869. [Google Scholar] [CrossRef] [PubMed]
  93. Koike, T.; Kin, T.; Tanaka, Y.; Uchikawa, H.; Shiode, T.; Saito, N. Development of Innovative Neurosurgical Operation Support Method Using Mixed-Reality Computer Graphics. World Neurosurg. X 2021, 11, 100102. [Google Scholar] [CrossRef] [PubMed]
  94. Huang, L.; Collins, S.; Kobayashi, L.; Merck, D.; Sgouros, T. Shared visualizations and guided procedure simulation in augmented reality with Microsoft HoloLens. In Proceedings of the Medical Imaging 2019: Image-Guided Procedures, Robotic Interventions, and Modeling, San Diego, CA, USA, 16–21 February 2019; Fei, B., Linte, C.A., Eds.; SPIE-Int Soc Optical Engineering: Bellingham, WA, USA, 2019. [Google Scholar]
  95. Perkins, S.L.; Lin, M.A.; Srinivasan, S.; Wheeler, A.J.; Hargreaves, B.A.; Daniel, B.L. A mixed-reality system for breast surgical planning. In Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Nantes, France, 9–13 October 2017; IEEE: Nantes, France, 2017; pp. 269–274. [Google Scholar]
  96. Amini, S.; Kersten-Oertel, M. Augmented reality mastectomy surgical planning prototype using the HoloLens template for healthcare technology letters. Healthc. Technol. Lett. 2019, 6, 261–265. [Google Scholar] [CrossRef]
  97. Galati, R.; Simone, M.; Barile, G.; De Luca, R.; Cartanese, C.; Grassi, G. Experimental Setup Employed in the Operating Room Based on Virtual and Mixed Reality: Analysis of Pros and Cons in Open Abdomen Surgery. J. Health Eng. 2020, 2020, 8851964. [Google Scholar] [CrossRef]
  98. Werkgartner, G.; Lemmerer, M.; Hauser, H.; Sorantin, E.; Beichel, R.; Reitinger, B.; Bornik, A.; Leberl, F.; Mischinger, H.J. Augmented-reality-based liver-surgical planning system. Eur. Surg. 2004, 36, 270–274. [Google Scholar] [CrossRef]
  99. Wacker, F.K.; Vogt, S.; Khamene, A.; Sauer, F.; Wendt, M.; Duerk, J.L.; Lewin, J.S.; Wolf, K.J. MR image-guided needle biopsies with a combination of augmented reality and MRI: A pilot study in phantoms and animals. In Proceedings of the CARS 2005: Computer Assisted Radiology and Surgery, Berlin, Germany, 22–25 June 2005; Elsevier Science Bv: Amsterdam, The Netherlands, 2005; pp. 424–428. [Google Scholar]
  100. Heinrich, F.; Schwenderling, L.; Joeres, F.; Lawonn, K.; Hansen, C. Comparison of Augmented Reality Display Techniques to Support Medical Needle Insertion. IEEE Trans. Vis. Comput. Graph. 2020, 26, 3568–3575. [Google Scholar] [CrossRef]
  101. Cattari, N.; Condino, S.; Cutolo, F.; Ferrari, M.; Ferrari, V. In Situ Visualization for 3D Ultrasound-Guided Interventions with Augmented Reality Headset. Bioengineering 2021, 8, 131. [Google Scholar] [CrossRef]
  102. Cardin, M.A.; Wang, J.X.; Lobaugh, N.J.; Guimont, I.; Plewes, D.B. A quantitative evaluation of human coordination interfaces for computer assisted surgery. Comput. Aided Surg. 2007, 12, 71–81. [Google Scholar] [CrossRef]
  103. El-Khameesy, N.; El-Wishy, H. Evaluating the impact of virtual reality on liver surgical planning procedures. In Modeling and Simulation in Engineering, Economics, and Management; Engemann, K.J., GilLafuente, A.M., Merigo, J.M., Eds.; Springer: Berlin, Germany, 2012; pp. 210–218. [Google Scholar]
  104. Freschi, C.; Troia, E.; Ferrari, V.; Megali, G.; Pietrabissa, A.; Mosca, F. Ultrasound guided robotic biopsy using augmented reality and human-robot cooperative control. In Proceedings of the 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, 3–6 September 2009; IEEE: New York, NY, USA, 2009; Volume 1–20, pp. 5110–5113. [Google Scholar]
  105. Pfeiffer, M.; Kenngott, H.; Preukschas, A.; Huber, M.; Bettscheider, L.; Müller-Stich, B.; Speidel, S. IMHOTEP: Virtual reality framework for surgical applications. Int. J. Comput. Assist. Radiol. Surg. 2018, 13, 741–748. [Google Scholar] [CrossRef] [PubMed]
  106. Reitinger, B.; Bornik, A.; Beichel, R.; Schmalstieg, D. Liver surgery planning using virtual reality. IEEE Comput. Graph. Appl. 2006, 26, 36–47. [Google Scholar] [CrossRef] [PubMed]
  107. Tian, F.; Wu, J.X.; Rong, W.Q.; Wang, L.M.; Wu, F.; Yu, W.B.; An, S.L.; Liu, F.Q.; Feng, L.; Bi, C.; et al. Three-dimensional morphometric analysis for hepatectomy of centrally located hepatocellular carcinoma: A pilot study. World J. Gastroenterol. 2015, 21, 4607–4619. [Google Scholar] [CrossRef] [PubMed]
  108. Vos, E.L.; Koning, A.H.; Obdeijn, I.M.; van Verschuer, V.M.; Verhoef, C.; van der Spek, P.J.; Menke-Pluijmers, M.B.; Koppert, L.B. Preoperative prediction of cosmetic results in breast conserving surgery. J. Surg. Oncol. 2014, 111, 178–184. [Google Scholar] [CrossRef] [PubMed]
  109. Wellens, L.M.; Meulstee, J.; van de Ven, C.P.; Van Scheltinga, C.T.; Littooij, A.S.; van den Heuvel-Eibrink, M.M.; Fiocco, M.; Rios, A.C.; Maal, T.; Wijnen, M.H. Comparison of 3-Dimensional and Augmented Reality Kidney Models with Conventional Imaging Data in the Preoperative Assessment of Children with Wilms Tumors. JAMA Netw. Open 2019, 2, e192633. [Google Scholar] [CrossRef]
  110. Asgar-Deen, D.; Carriere, J.; Wiebe, E.; Peiris, L.; Duha, A.; Tavakoli, M. Augmented Reality Guided Needle Biopsy of Soft Tissue: A Pilot Study. Front. Robot. AI 2020, 7, 72. [Google Scholar] [CrossRef]
  111. Davrieux, C.F.; Carriere, J.; Wiebe, E.; Peiris, L.; Duha, A.; Tavakoli, M. Mixed reality navigation system for ultrasound-guided percutaneous punctures: A pre-clinical evaluation. Surg. Endosc. 2020, 34, 226–230. [Google Scholar] [CrossRef]
  112. Gao, Y.; Zhao, Y.; Xie, L.; Zheng, G. A Projector-Based Augmented Reality Navigation System for Computer-Assisted Surgery. Sensors 2021, 21, 2931. [Google Scholar] [CrossRef]
  113. Huettl, F.; Saalfeld, P.; Hansen, C.; Preim, B.; Poplawski, A.; Kneist, W.; Lang, H.; Huber, T. Virtual reality and 3D printing improve preoperative visualization of 3D liver reconstructions-results from a preclinical comparison of presentation modalities and user’s preference. Ann. Transl. Med. 2021, 9, 1074. [Google Scholar] [CrossRef]
  114. Kumar, R.P.; Pelanis, E.; Bugge, R.; Brun, H.; Palomar, R.; Aghayan, D.L.; Fretland, Å.A.; Edwin, B.; Elle, O.J. Use of mixed reality for surgery planning: Assessment and development workflow. J. Biomed. Inform. 2020, 112, 100077. [Google Scholar] [CrossRef]
  115. Southworth, M.K.; Silva, J.N.A.; Blume, W.M.; Van Hare, G.F.; Dalal, A.S.; Silva, J.R. Performance Evaluation of Mixed Reality Display for Guidance during Transcatheter Cardiac Mapping and Ablation. IEEE J. Transl. Eng. Health Med. 2020, 8, 1–10. [Google Scholar] [CrossRef] [PubMed]
  116. Ye, W.; Zhang, X.; Li, T.; Luo, C.; Yang, L. Mixed-reality hologram for diagnosis and surgical planning of double outlet of the right ventricle: A pilot study. Clin. Radiol. 2020, 76, 237.e1–237.e7. [Google Scholar] [CrossRef] [PubMed]
  117. Ballocca, F.; Meier, L.M.; Ladha, K.; Hiansen, J.Q.; Horlick, E.M.; Meineri, M. Validation of quantitative 3-dimensional transesophageal echocardiography mitral valve analysis using stereoscopic display. J. Cardiothorac. Vasc. Anesth. 2018, 33, 732–741. [Google Scholar] [CrossRef]
  118. Bruckheimer, E.; Rotschild, C.; Dagan, T.; Amir, G.; Kaufman, A.; Gelman, S.; Birk, E. Computer-generated real-time digital holography: First time use in clinical medical imaging. Eur. Heart J. Cardiovasc. Imaging 2016, 17, 845–849. [Google Scholar] [CrossRef]
  119. Guiraudon, G.M.; Jones, D.L.; Bainbridge, D.; Linte, C.; Pace, D.; Moore, J.; Wedlake, C.; Lang, P.; Peters, T.M. Augmented reality image guidance during off-pump mitral valve replacement through the guiraudon universal cardiac introducer. Innov. Technol. Tech. Cardiothorac. Vasc. Surg. 2010, 5, 430–438. [Google Scholar] [CrossRef]
  120. Harake, D.; Gnanappa, G.K.; Alvarez, S.G.; Whittle, A.; Punithakumar, K.; Boechler, P.; Noga, M.; Khoo, N.S. Stereoscopic Display Is Superior to Conventional Display for Three-Dimensional Echocardiography of Congenital Heart Anatomy. J. Am. Soc. Echocardiogr. 2020, 33, 1297–1305. [Google Scholar] [CrossRef]
  121. Kim, B.; Loke, Y.H.; Mass, P.; Irwin, M.R.; Capeland, C.; Olivieri, L.; Krieger, A. A Novel Virtual Reality Medical Image Display System for Group Discussions of Congenital Heart Disease: Development and Usability Testing. JMIR Cardio 2020, 4, e20633. [Google Scholar] [CrossRef]
  122. Kozlowski, P.; Urheim, S.; Samset, E. Evaluation of a multi-view autostereoscopic real-time 3D ultrasound system for minimally invasive cardiac surgery guidance. In Proceedings of the 2017 IEEE 14th International Symposium on Biomedical Imaging, Melbourne, Australia, 18–21 April 2017; IEEE: New York, NY, USA, 2017; pp. 604–607. [Google Scholar]
  123. Lo, J.; Moore, J.; Wedlake, C.; Guiraudon, G.; Eagleson, R.; Peters, T. Surgeon-controlled visualization techniques for virtual reality-guided cardiac surgery. Stud. Health Technol. Inform. 2009, 142, 162–167. [Google Scholar]
  124. Napa, S.; Moore, M.; Bardyn, T. Advancing Cardiac Surgery Case Planning and Case Review Conferences Using Virtual Reality in Medical Libraries: Evaluation of the Usability of Two Virtual Reality Apps. JMIR Hum. Factors 2019, 6, e12008. [Google Scholar] [CrossRef]
  125. Seitel, M.; Maier-Hein, L.; Rietdorf, U.; Nikoloff, S.; Seitel, A.; Franz, A.; Kenngott, H.; Karck, M.; De Simone, R.; Wolf, I.; et al. Towards a mixed reality environment for preoperative planning of cardiac surgery. Stud. Health Technol. Inform. 2009, 142, 307–309. [Google Scholar]
  126. Pushparajah, K.; Chu, K.Y.K.; Deng, S.; Wheeler, G.; Gomez, A.; Kabir, S.; Schnabel, J.A.; Simpson, J.M. Virtual reality three-dimensional echocardiographic imaging for planning surgical atrioventricular valve repair. JTCVS Tech. 2021, 7, 269–277. [Google Scholar] [CrossRef] [PubMed]
  127. Kim, B.; Nguyen, P.; Loke, Y.H.; Cleveland, V.; Liu, X.; Mass, P.; Hibino, N.; Olivieri, L.; Krieger, A. Virtual Reality Cardiac Surgical Planning Software (CorFix) for Designing Patient-Specific Vascular Grafts: Development and Pilot Usability Study. JMIR Cardio 2022, 6, e35488. [Google Scholar] [CrossRef] [PubMed]
  128. Pellegrino, G.; Mangano, C.; Mangano, R.; Ferri, A.; Taraschi, V.; Marchetti, C. Augmented reality for dental implantology: A pilot clinical report of two cases. BMC Oral Health 2019, 19, 158. [Google Scholar] [CrossRef] [PubMed]
  129. Pepe, A.; Trotta, G.F.; Mohr-Ziak, P.; Gsaxner, C.; Wallner, J.; Bevilacqua, V.; Egger, J. A Marker-Less Registration Approach for Mixed Reality-Aided Maxillofacial Surgery: A Pilot Evaluation. J. Digit. Imaging 2019, 32, 1008–1018. [Google Scholar] [CrossRef] [PubMed]
  130. Katic, D.; Spengler, P.; Bodenstedt, S.; Castrillon-Oberndorfer, G.; Seeberger, R.; Hoffmann, J.; Dillmann, R.; Speidel, S. A system for context-aware intraoperative augmented reality in dental implant surgery. Int. J. Comput. Assist. Radiol. Surg. 2015, 10, 101–108. [Google Scholar] [CrossRef]
  131. Glas, H.H.; Kraeima, J.; van Ooijen, P.M.A.; Spijkervet, F.K.L.; Yu, L.; Witjes, M.J.H. Augmented Reality Visualization for Image-Guided Surgery: A Validation Study Using a Three-Dimensional Printed Phantom. J. Oral Maxillofac. Surg. 2021, 79, 1943.e1–1943.e10. [Google Scholar] [CrossRef]
  132. Kim, H.; Jeong, S.; Seo, J.; Park, I.; Ko, H.; Moon, S.Y. Augmented reality for botulinum toxin injection. Concurr. Comput. Pract. Exp. 2019, 32, e5526. [Google Scholar] [CrossRef]
  133. Al-Saud, L.M.; Mushtaq, F.; Mirghani, I.A.; Balkhoyor, A.; Keeling, A.; Manogue, M.; Mon-Williams, M.A. Drilling into the functional significance of stereopsis: The impact of stereoscopic information on surgical performance. Ophthalmic Physiol. Opt. 2017, 37, 498–506. [Google Scholar] [CrossRef]
  134. Bartella, A.K.; Kamal, M.; Scholl, I.; Schiffer, S.; Steegmann, J.; Ketelsen, D.; Hölzle, F.; Lethaus, B. Virtual reality in preoperative imaging in maxillofacial surgery: Implementation of “the next level”? Br. J. Oral Maxillofac. Surg. 2019, 57, 644–648. [Google Scholar] [CrossRef]
  135. Enislidis, G.; Wagner, A.; Ploder, O.; Ewers, R. Computed intraoperative navigation guidance—A preliminary report on a new technique. Br. J. Oral Maxillofac. Surg. 1997, 35, 271–274. [Google Scholar] [CrossRef]
  136. Stamm, T.; Meyer, U.; Meier, N.; Ehmer, U.; Joos, U. Public domain computer-aided surgery (CAS) in orthodontic and maxillofacial surgery. J. Orofac. Orthop. 2002, 63, 62–75. [Google Scholar] [CrossRef] [PubMed]
  137. Suenaga, H.; Tran, H.H.; Liao, H.; Masamune, K.; Dohi, T.; Hoshi, K.; Takato, T. Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: A pilot study. BMC Med. Imaging 2015, 15, 11. [Google Scholar] [CrossRef] [PubMed]
  138. Suenaga, H.; Hoang Tran, H.; Liao, H.; Masamune, K.; Dohi, T.; Hoshi, K.; Mori, Y.; Takato, T. Real-time in situ three-dimensional integral videography and surgical navigation using augmented reality: A pilot study. Int. J. Oral Sci. 2013, 5, 98–102. [Google Scholar] [CrossRef]
  139. Rojas-Munoz, E.; Cabrera, M.E.; Andersen, D.; Popescu, V.; Marley, S.; Mullis, B.; Zarzaur, B.; Wachs, J. Surgical Telementoring Without Encumbrance A Comparative Study of See-through Augmented Reality-based Approaches. Ann. Surg. 2019, 270, 384–389. [Google Scholar] [CrossRef]
  140. Qian, L.; Deguet, A.; Wang, Z.; Liu, Y.H.; Kazanzides, P. Augmented Reality Assisted Instrument Insertion and Tool Manipulation for the First Assistant in Robotic Surgery. In Proceedings of the 2019 International Conference on Robotics and Automation, Montreal, QC, Canada, 20–24 May 2019; Howard, A., Ed.; IEEE: New York, NY, USA, 2019; pp. 5173–5179. [Google Scholar]
  141. Moosburner, S.; Remde, C.; Tang, P.; Queisner, M.; Haep, N.; Pratschke, J.; Sauer, I.M. Real world usability analysis of two augmented reality headsets in visceral surgery. Artif. Organs 2018, 43, 694–698. [Google Scholar] [CrossRef] [PubMed]
  142. Arpaia, P.; De Benedetto, E.; De Paolis, L.; D’Errico, G.; Donato, N.; Duraccio, L. Performance and Usability Evaluation of an Extended Reality Platform to Monitor Patient’s Health during Surgical Procedures. Sensors 2022, 22, 3908. [Google Scholar] [CrossRef]
  143. Andersen, D.; Popescu, V.; Cabrera, M.E.; Shanghavi, A.; Mullis, B.; Marley, S.; Gomez, G.; Wachs, J.P. An augmented reality-based approach for surgical telementoring in austere environments. Mil. Med. 2017, 182, 310–315. [Google Scholar] [CrossRef]
  144. Boedecker, C.; Huettl, F.; Saalfeld, P.; Paschold, M.; Kneist, W.; Baumgart, J.; Preim, B.; Hansen, C.; Lang, H.; Huber, T. Using virtual 3D-models in surgical planning: Workflow of an immersive virtual reality application in liver surgery. Langenbeck’s Arch. Surg. 2021, 406, 911–915. [Google Scholar] [CrossRef]
  145. Hansen, C.; Wieferich, J.; Ritter, F.; Rieder, C.; Peitgen, H.O. Illustrative visualization of 3D planning models for augmented reality in liver surgery. Int. J. Comput. Assist. Radiol. Surg. 2010, 5, 133–141. [Google Scholar] [CrossRef]
  146. Sampogna, G.; Pugliese, R.; Elli, M.; Vanzulli, A.; Forgione, A. Routine clinical application of virtual reality in abdominal surgery. Minim. Invasive Ther. Allied Technol. 2017, 26, 135–143. [Google Scholar] [CrossRef]
  147. Vertemati, M.; Cassin, S.; Rizzetto, F.; Vanzulli, A.; Elli, M.; Sampogna, G.; Gallieni, M. A Virtual Reality Environment to Visualize Three-Dimensional Patient-Specific Models by a Mobile Head-Mounted Display. Surg. Innov. 2019, 26, 359–370. [Google Scholar] [CrossRef] [PubMed]
  148. Allgaier, M.; Chheang, V.; Saalfeld, P.; Apilla, V.; Huber, T.; Huettl, F.; Neyazi, B.; Sandalcioglu, I.E.; Hansen, C.; Preim, B.; et al. A comparison of input devices for precise interaction tasks in VR-based surgical planning and training. Comput. Biol. Med. 2022, 145, 105429. [Google Scholar] [CrossRef]
  149. Hombeck, J.; Meuschke, M.; Zyla, L.; Heuser, A.J.; Toader, J.; Popp, F.; Bruns, C.J.; Hansen, C.; Datta, R.R.; Lawonn, K. Evaluating Perceptional Tasks for Medicine: A Comparative User Study between a Virtual Reality and a Desktop Application. In Proceedings of the IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR), Christchurch, New Zealand, 12–16 March 2022. [Google Scholar]
  150. Wu, X.; Zeng, N.; Hu, H.; Pan, M.; Jia, F.; Wen, S.; Tian, J.; Yang, J.; Fang, C. Preliminary Exploration on the Efficacy of Augmented Reality-Guided Hepatectomy for Hepatolithiasis. J. Am. Coll. Surg. 2022, 235, 677–688. [Google Scholar] [CrossRef] [PubMed]
  151. Sadri, S.; Kohen, S.A.; Elvezio, C.; Sun, S.H.; Grinshpoon, A.; Loeb, G.J.; Basu, N.; Feiner, S.K. Manipulating 3D Anatomic Models in Augmented Reality: Comparing a Hands-Free Approach and a Manual Approach. In Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality, Beijing, China, 10–18 October 2019; IEEE: New York, NY, USA, 2019; pp. 93–102. [Google Scholar]
  152. Zuo, Y.; Jiang, T.; Dou, J.; Yu, D.; Ndaro, Z.N.; Du, Y.; Li, Q.; Wang, S.; Huang, G. A Novel Evaluation Model for a Mixed-Reality Surgical Navigation System: Where Microsoft HoloLens Meets the Operating Room. Surg. Innov. 2020, 27, 193–202. [Google Scholar] [CrossRef] [PubMed]
  153. Bartlett, C.; LeRoy, N.; Schofield, D.; Ford, J.; Decker, S. Assessing the Feasibility of using Augmented Reality to Visualize Interventional Radiology Imagery. In Proceedings of the IVAPP: 15th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, Valletta, Malta, 27–29 February 2020; Kerren, A., Hurter, C., Braz, J., Eds.; Scitepress: Setubal, Portugal; Volume 3, pp. 169–176. [Google Scholar]
  154. Desender, L.; Rancic, Z.; Aggarwal, R.; Duchateau, J.; Glenck, M.; Lachat, M.; Vermassen, F.; Van Herzeele, I.; EVEREST (European Virtual Reality Endovascular RESearch Team). Patient-specific Rehearsal Prior to EVAR: A Pilot Study. Eur. J. Vasc. Endovasc. Surg. 2013, 45, 639–647. [Google Scholar] [CrossRef]
  155. Furtado, H.; Stüdeli, T.; Sette, M.; Morita, T.; Trunk, P.; Freudenthal, A.; Samset, E.; Bergsland, J.; Geršak, B. Endoclamp Balloon Visualization and Automatic Placement System. Heart Surg. Forum 2010, 13, E205–E211. [Google Scholar] [CrossRef]
  156. Guo, S.; Cai, X.; Gao, B. A tensor-mass method-based vascular model and its performance evaluation for interventional surgery virtual reality simulator. Int. J. Med. Robot. Comput. Assist. Surg. 2018, 14, e1946. [Google Scholar] [CrossRef]
  157. Wang, J.; Fallavollita, P.; Wang, L.; Kreiser, M.; Navab, N. Augmented reality during angiography: Integration of a virtual mirror for improved 2D/3D visualization. In Proceedings of the 2012 IEEE International Symposium on Mixed and Augmented Reality, Atlanta, GA, USA, 5–8 November 2012; IEEE: New York, NY, USA, 2012; pp. 257–264. [Google Scholar]
  158. Scherl, C.; Stratemeier, J.; Karle, C.; Rotter, N.; Hesser, J.; Huber, L.; Dias, A.; Hoffmann, O.; Riffel, P.; Schoenberg, S.O.; et al. Augmented reality with HoloLens in parotid surgery: How to assess and to improve accuracy. Eur. Arch. Oto-Rhino-Laryngol. 2020, 278, 2473–2483. [Google Scholar] [CrossRef]
  159. Scherl, C.; Stratemeier, J.; Rotter, N.; Hesser, J.; Schönberg, S.O.; Servais, J.J.; Männle, D.; Lammert, A. Augmented Reality with HoloLens® in Parotid Tumor Surgery: A Prospective Feasibility Study. ORL J. Oto-Rhino-Laryngol. Head Neck Surg. 2021, 10, 439–448. [Google Scholar] [CrossRef]
  160. Gsaxner, C.; Pepe, A.; Li, J.; Ibrahimpasic, U.; Wallner, J.; Schmalstieg, D.; Egger, J. Augmented Reality for Head and Neck Carcinoma Imaging: Description and Feasibility of an Instant Calibration, Markerless Approach. Comput. Methods Programs Biomed. 2021, 200, 8. [Google Scholar] [CrossRef]
  161. Andersen, S.A.W.; Varadarajan, V.V.; Moberly, A.C.; Hittle, B.; Powell, K.A.; Wiet, G.J. Patient-specific Virtual Temporal Bone Simulation Based on Clinical Cone-beam Computed Tomography. Laryngoscope 2021, 131, 1855–1862. [Google Scholar] [CrossRef] [PubMed]
  162. Hans, P.; Grant, A.J.; Laitt, R.D.; Ramsden, R.T.; Kassner, A.; Jackson, A. Comparison of three-dimensional visualization techniques for depicting the scala vestibuli and scala tympani of the cochlea by using high-resolution MR imaging. Am. J. Neuroradiol. 1999, 20, 1197–1206. [Google Scholar] [PubMed]
  163. Timonen, T.; Iso-Mustajärvi, M.; Linder, P.; Lehtimäki, A.; Löppönen, H.; Elomaa, A.P.; Dietz, A. Virtual reality improves the accuracy of simulated preoperative planning in temporal bones: A feasibility and validation study. Eur. Arch. Oto-Rhino-Laryngol. 2020, 278, 2795–2806. [Google Scholar] [CrossRef] [PubMed]
  164. Ungar, O.J.; Handzel, O.; Haviv, L.; Dadia, S.; Cavel, O.; Fliss, D.M.; Oron, Y. Optimal Head Position Following Intratympanic Injections of Steroids, As Determined by Virtual Reality. Otolaryngol. Head Neck Surg. 2019, 161, 1012–1017. [Google Scholar] [CrossRef] [PubMed]
  165. Hendel, K.; Ortner, V.K.; Fuchs, C.S.; Eckhouse, V.; Haedersdal, M. Dermatologic Scar Assessment with Stereoscopic Imaging and Digital Three-Dimensional Models: A Validation Study. Lasers Surg. Med. 2021, 7, 1043–1049. [Google Scholar] [CrossRef]
  166. Jinnin, M.; Fukushima, S.; Masuguchi, S.; Tanaka, H.; Kawashita, Y.; Ishihara, T.; Ihn, H. Evaluation of usefulness of 3D views for clinical photography. Biosci. Trends 2011, 5, 211–216. [Google Scholar] [CrossRef]
  167. van den Berg, N.S.; Engelen, T.; Brouwer, O.R.; Mathéron, H.M.; Valdés-Olmos, R.A.; Nieweg, O.E.; van Leeuwen, F.W. A pilot study of SPECT/CT-based mixed-reality navigation towards the sentinel node in patients with melanoma or Merkel cell carcinoma of a lower extremity. Nucl. Med. Commun. 2016, 37, 812–817. [Google Scholar] [CrossRef]
  168. Rojas-Muñoz, E.; Lin, C.; Sanchez-Tamayo, N.; Cabrera, M.E.; Andersen, D.; Popescu, V.; Barragan, J.A.; Zarzaur, B.; Murphy, P.; Anderson, K.; et al. Evaluation of an augmented reality platform for austere surgical telementoring: A randomized controlled crossover study in cricothyroidotomies. Npj Digit. Med. 2020, 3, 75. [Google Scholar] [CrossRef]
  169. Rojas-Munoz, E.; Cabrera, M.E.; Lin, C.; Sánchez-Tamayo, N.; Andersen, D.; Popescu, V.; Anderson, K.; Zarzaur, B.; Mullis, B.; Wachs, J.P. Telementoring in Leg Fasciotomies via Mixed-Reality: Clinical Evaluation of the STAR Platform. Mil. Med. 2020, 185, 513–520. [Google Scholar] [CrossRef]
  170. Andersen, D.S.; Cabrera, M.E.; Rojas-Muñoz, E.J.; Popescu, V.S.; Gonzalez, G.T.; Mullis, B.; Marley, S.; Zarzaur, B.L.; Wachs, J.P. Augmented Reality Future Step Visualization for Robust Surgical Telementoring. Simul. Healthc. J. Soc. Simul. Healthc. 2019, 14, 59–66. [Google Scholar] [CrossRef]
  171. Block, F.E.; Yablok, D.O.; McDonald, J.S. Clinical-evaluation of the head-up display of anesthesia data—Preliminary communication. Int. J. Clin. Monit. Comput. 1995, 12, 21–24. [Google Scholar] [CrossRef] [PubMed]
  172. Jeon, Y.; Choi, S.; Kim, H. Evaluation of a simplified augmented reality device for ultrasound-guided vascular access in a vascular phantom. J. Clin. Anesth. 2014, 26, 485–489. [Google Scholar] [CrossRef] [PubMed]
  173. Novotny, J.; Miller, W.R.; Luks, F.I.; Merck, D.; Collins, S.; Laidlaw, D.H. Towards Placental Surface Vasculature Exploration in Virtual Reality. IEEE Comput. Graph. Appl. 2020, 40, 28–39. [Google Scholar] [CrossRef]
  174. Niitsu, H.; Hirabayashi, N.; Yoshimitsu, M.; Mimura, T.; Taomoto, J.; Sugiyama, Y.; Murakami, S.; Saeki, S.; Mukaida, H.; Takiyama, W. Using the Objective Structured Assessment of Technical Skills (OSATS) global rating scale to evaluate the skills of surgical trainees in the operating room. Surg. Today 2013, 43, 271–275. [Google Scholar] [CrossRef]
  175. Martin, J.; Regehr, G.; Reznick, R.; Macrae, H.; Murnaghan, J.; Hutchison, C.; Brown, M. Objective structured assessment of technical skill (OSATS) for surgical residents. Br. J. Surg. 1997, 84, 273–278. [Google Scholar] [PubMed]
  176. Brooke, J. SUS: A "quick and dirty" usability scale. In Usability Evaluation in Industry; Jordan, P.W., Thomas, B., Weerdmeester, B.A., McClelland, I.L., Eds.; Taylor & Francis: London, UK, 1996; pp. 189–194. [Google Scholar]
  177. Sevdalis, N.; Davis, R.; Koutantji, M.; Undre, S.; Darzi, A.; Vincent, C.A. Reliability of a revised NOTECHS scale for use in surgical teams. Am. J. Surg. 2008, 196, 184–190. [Google Scholar] [CrossRef] [PubMed]
  178. Malec, J.F.; Torsher, L.C.; Dunn, W.F.; Wiegmann, D.A.; Arnold, J.J.; Brown, D.A.; Phatak, V. The mayo high performance teamwork scale: Reliability and validity for evaluating key crew resource management skills. Simul. Healthc. 2007, 2, 4–10. [Google Scholar] [CrossRef]
  179. Wiegmann, D.A.; ElBardissi, A.W.; Dearani, J.A.; Daly, R.C.; Sundt III, T.M. Disruptions in surgical flow and their relationship to surgical errors: An exploratory investigation. Surgery 2007, 142, 658–665. [Google Scholar] [CrossRef]
  180. Hart, S.G. NASA-task load index (NASA-TLX); 20 years later. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2006, 50, 904–908. [Google Scholar] [CrossRef]
  181. Hart, S.G.; Staveland, L.E. Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. In Advances in Psychology; North-Holland: Amsterdam, The Netherlands, 1988; Volume 52, pp. 139–183. [Google Scholar]
  182. McKendrick, R.D.; Cherry, E. A deeper look at the NASA TLX and where it falls short. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2018, 62, 44–48. [Google Scholar] [CrossRef]
  183. Byers, J.C.; Bittner, A.C.; Hill, S.G. Traditional and raw task load index (TLX) correlations: Are paired comparisons necessary? In Advances in Industrial Ergonomics and Safety; Mital, A., Ed.; CRC Press: Cincinnati, OH, USA, 1989; pp. 481–485. [Google Scholar]
  184. Miyake, S. Factors influencing mental workload indexes. J. Univ. Occup. Environ. Health 1997, 19, 313–325. [Google Scholar] [CrossRef]
  185. Hughes-Hallett, A.; Mayer, E.K.; Marcus, H.J.; Pratt, P.; Mason, S.; Darzi, A.W.; Vale, J.A. Inattention blindness in surgery. Surg. Endosc. 2015, 29, 3184–3189. [Google Scholar] [CrossRef] [PubMed]
  186. Dixon, B.J.; Daly, M.J.; Chan, H.; Vescan, A.D.; Witterick, I.J.; Irish, J.C. Surgeons blinded by enhanced navigation: The effect of augmented reality on attention. Surg. Endosc. 2013, 27, 454–461. [Google Scholar] [CrossRef] [PubMed]
  187. Vandenberg, S.G.; Kuse, A.R. Mental Rotations, a Group Test of Three-Dimensional Spatial Visualization. Percept. Mot. Ski. 1978, 47, 599–604. [Google Scholar] [CrossRef] [PubMed]
  188. Dain, S.J. Clinical colour vision tests. Clin. Exp. Optom. 2004, 87, 276–293. [Google Scholar] [CrossRef]
  189. Tsurutani, K.; Naruse, K.; Oshima, K.; Uehara, S.; Sato, Y.; Inoguchi, K.; Otsuka, K.; Wakemoto, H.; Kurashige, M.; Sato, O.; et al. 65-2: Optical attachment to measure both eye-box/FOV characteristics for AR/VR eyewear displays. In Proceedings of the SID Symposium Digest of Technical Papers, Los Angeles, CA, USA, 21–26 May 2017; pp. 954–957. [Google Scholar]
  190. Draper, R.S.; Penczek, J.; Varshneya, R.; Boynton, P.A. 72-2: Standardizing fundamental criteria for near eye display optical measurements: Determining eye point position. In Proceedings of the SID Symposium Digest of Technical Papers, Los Angeles, CA, USA, 20–25 May 2018; pp. 961–964. [Google Scholar]
  191. Nichols, S.; Patel, H. Health and safety implications of virtual reality: A review of empirical evidence. Appl. Ergon. 2002, 33, 251–271. [Google Scholar] [CrossRef]
  192. LaViola, J.J., Jr. A discussion of cybersickness in virtual environments. Spec. Interest Group Comput. Hum. Interact. Bull. 2000, 32, 47–56. [Google Scholar] [CrossRef]
  193. DiZio, P.; Lackner, J.R. Circumventing Side Effects of Immersive Virtual Environments; HCI International: San Francisco, CA, USA, 1997; pp. 893–896. [Google Scholar]
  194. Sielhorst, T.; Bichlmeier, C.; Heining, S.M.; Navab, N. Depth perception–a major issue in medical AR: Evaluation study by twenty surgeons. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Copenhagen, Denmark, 1–6 October 2006; Springer: Berlin, Germany, 2006; pp. 364–372. [Google Scholar]
  195. Duane, A. Normal values of the accommodation at all ages. J. Am. Med. Assoc. 1912, 59, 1010–1013. [Google Scholar] [CrossRef]
  196. Padmanaban, N.; Konrad, R.; Stramer, T.; Cooper, E.A.; Wetzstein, G. Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays. Proc. Natl. Acad. Sci. USA 2017, 114, 2183–2188. [Google Scholar] [CrossRef]
  197. Condino, S.; Turini, G.; Parchi, P.D.; Viglialoro, R.M.; Piolanti, N.; Gesi, M.; Ferrari, M.; Ferrari, V. How to Build a Patient-Specific Hybrid Simulator for Orthopaedic Open Surgery: Benefits and Limits of Mixed-Reality Using the Microsoft HoloLens. J. Health Eng. 2018, 2018, 5435097. [Google Scholar] [CrossRef] [PubMed]
  198. Lee, H.J.; Drag, L.L.; Bieliauskas, L.A.; Langenecker, S.A.; Graver, C.; O’Neill, J.; Greenfield, L. Results from the cognitive changes and retirement among senior surgeons self-report survey. J. Am. Coll. Surg. 2009, 209, 668–671.e2. [Google Scholar] [CrossRef] [PubMed]
  199. Garcia, A.; Baldwin, C.; Dworsky, M. Gender differences in simulator sickness in fixed-versus rotating-base driving simulator. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2010, 54, 1551–1555. [Google Scholar] [CrossRef]
  200. Jaeger, B.K.; Mourant, R.R. Comparison of simulator sickness using static and dynamic walking simulators. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2001, 45, 1896–1900. [Google Scholar] [CrossRef]
  201. Stanney, K.M.; Hale, K.S.; Nahmens, I.; Kennedy, R.S. What to expect from immersive virtual environment exposure: Influences of gender, body mass index, and past experience. Hum. Factors 2003, 45, 504–520. [Google Scholar] [CrossRef]
  202. Linn, M.C.; Petersen, A.C. Emergence and characterization of sex differences in spatial ability: A meta-analysis. Child Dev. 1985, 56, 1479–1498. [Google Scholar] [CrossRef]
  203. Voyer, D.; Saunders, K.A. Gender differences on the mental rotations test: A factor analysis. Acta Psychol. 2004, 117, 79–94. [Google Scholar] [CrossRef]
  204. Voyer, D.; Voyer, S.; Bryden, M.P. Magnitude of sex differences in spatial abilities: A meta-analysis and consideration of critical variables. Psychol. Bull. 1995, 117, 250–270. [Google Scholar] [CrossRef]
  205. McWilliams, W.; Hamilton, C.J.; Muncer, S.J. On Mental Rotation in Three Dimensions. Percept. Mot. Ski. 1997, 85, 297–298. [Google Scholar] [CrossRef]
  206. Larson, P.; Rizzo, A.A.; Buckwalter, J.G.; van Rooyen, A.; Kratz, K.; Neumann, U.; Kesselman, C.; Thiébaux, M.; van der Zaag, C. Gender Issues in the Use of Virtual Environments. CyberPsychol. Behav. 1999, 2, 113–123. [Google Scholar] [CrossRef]
  207. Parsons, T.D.; Larson, P.; Kratz, K.; Thiebaux, M.; Bluestein, B.; Buckwalter, J.G.; Rizzo, A.A. Sex differences in mental rotation and spatial rotation in a virtual environment. Neuropsychologia 2004, 42, 555–562. [Google Scholar] [CrossRef] [PubMed]
  208. Rodán, A.; Contreras, M.J.; Elosúa, M.R.; Gimeno, P. Experimental but Not Sex Differences of a Mental Rotation Training Program on Adolescents. Front. Psychol. 2016, 7, 1050. [Google Scholar] [CrossRef] [PubMed]
  209. Neubauer, A.C.; Bergner, S.; Schatz, M. Two- vs. three-dimensional presentation of mental rotation tasks: Sex differences and effects of training on performance and brain activation. Intelligence 2010, 38, 529–539. [Google Scholar] [CrossRef] [PubMed]
  210. Soler, L.; Nicolau, S.; Schmid, J.; Koehl, C.; Marescaux, J.; Pennec, X.; Ayache, N. Virtual reality and augmented reality in digestive surgery. In Proceedings of the Third IEEE and ACM International Symposium on Mixed and Augmented Reality, Arlington, VA, USA, 5 November 2004; pp. 278–279. [Google Scholar]
  211. Shekhar, R.; Dandekar, O.; Bhat, V.; Philip, M.; Lei, P.; Godinez, C.; Sutton, E.; George, I.; Kavic, S.; Mezrich, R.; et al. Live augmented reality: A new visualization method for laparoscopic surgery using continuous volumetric computed tomography. Surg. Endosc. 2010, 24, 1976–1985. [Google Scholar] [CrossRef] [PubMed]
  212. Nicolau, S.; Pennec, X.; Soler, L.; Buy, X.; Gangi, A.; Ayache, N.; Marescaux, J. An augmented reality system for liver thermal ablation: Design and evaluation on clinical cases. Med. Image Anal. 2009, 13, 494–506. [Google Scholar] [CrossRef]
  213. FDA. Applying Human Factors and Usability Engineering to Medical Devices; FDA: Silver Spring, MD, USA, 2016.
Figure 1. PRISMA flowchart for systematic literature review.
Figure 2. Time distribution of XR+ usability articles.
Figure 3. Time distribution of XR+ usability articles by hardware type.
Figure 4. Time distribution of (a) XR+ and (b) AR HMD usability articles for surgical planning vs. procedures.
Table 1. Analyzed articles listed by medical specialty (publications for AR HMDs and other XR+ devices).

Orthopedic/Spinal Surgery (40)
    AR HMD (16): [6,9,32,33,34,35,36,37,38,39,40,41,42,43,44,45]
    Other (24): [46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69]
Neurosurgery/Interventional Neuroradiology (26)
    AR HMD (8): [10,70,71,72,73,74,75,76]
    Other (18): [53,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93]
Surgical Oncology/Interventional Oncology (23)
    AR HMD (9): [8,94,95,96,97,98,99,100,101]
    Other (14): [58,68,102,103,104,105,106,107,108,109,110,111,112,113]
Cardiac Surgery/Interventional Cardiology (16)
    AR HMD (4): [7,114,115,116]
    Other (12): [105,117,118,119,120,121,122,123,124,125,126,127]
Oral and Maxillofacial Surgery (13)
    AR HMD (4): [128,129,130,131]
    Other (9): [58,77,132,133,134,135,136,137,138]
General Surgery (12)
    AR HMD (4): [139,140,141,142]
    Other (8): [143,144,145,146,147,148,149,150]
Endovascular Surgery (7)
    AR HMD (3): [151,152,153]
    Other (4): [154,155,156,157]
Otolaryngology (7)
    AR HMD (3): [158,159,160]
    Other (4): [161,162,163,164]
Dermatology/Plastic Surgery (3)
    AR HMD: --
    Other (3): [165,166,167]
Emergency Medicine/Trauma (3)
    AR HMD (2): [168,169]
    Other (1): [170]
Anesthesiology (2)
    AR HMD: --
    Other (2): [171,172]
Obstetrics (1)
    AR HMD: --
    Other (1): [173]
Table 2. Experience levels and reported demographic data of users. Each column is an experience-level subset of the included users; cells give the number of articles.

Article Subset                          Novice   Trainee   Expert   Not Reported
All XR+ Articles                          31       48       104         17
  Includes Number of Users                30       48        88          8
  Includes Number of Female Users         13       17        15          0
  Includes Age Statistics                 16       16        18          0
All AR HMD Articles                       12       14        36          6
  Includes Number of Users                11       14        30          2
  Includes Number of Female Users          5        8         6          0
  Includes Age Statistics                  7        8         9          0
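
Table 2 implies, for example, that demographic reporting was least complete for the largest user group: of the 104 XR+ articles that included expert users, only 15 (about 14%) reported the number of female users. A minimal Python sketch of that arithmetic follows, with the counts copied from Table 2 (illustrative only; the variable names are ours):

# Counts from Table 2: XR+ articles per experience-level subset, and how
# many of those articles reported the number of female users.
subset_totals   = {"novice": 31, "trainee": 48, "expert": 104, "not reported": 17}
reported_female = {"novice": 13, "trainee": 17, "expert": 15, "not reported": 0}

for level, total in subset_totals.items():
    n = reported_female[level]
    print(f"{level}: {n}/{total} articles report female user counts ({n / total:.0%})")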
Table 3. Usability assessments from analyzed articles, by category (number of XR+ articles in parentheses).

Task Performance (84): Error of position or orientation; Number of task completions; Success rate; Complication rate; Performance rating; Feasibility; Objective Structured Assessment of Technical Skills (OSATS); Surgical outcomes.
User Experience (80): Feedback on effectiveness, usefulness, quality, ergonomics, or visual effects; User preferences for system tools or visualization technique; User interaction/engagement with the visualization; Tool, material, or procedure choices compared with standard tools; System Usability Scale (SUS); Non-technical skills (NOTECHS) for surgeons; Mayo High Performance Teamwork Scale (MHPTS).
Time Management (55): Task completion time; Diagnosis/planning time; Setup/co-registration time.
System Performance (39): Frame rate; System display lag; Co-registration error; Calibration error; Effects of lighting and glove color on hand gesture controls; Effects of user movement on visualization accuracy.
Visual Effects (22): Effects of shading/color/transparency on depth perception; Identification of anatomical/surgical landmarks; Text readability; Contrast perception; Stereo Fly Test (SFT); Pseudo-Isochromatic Plate (PIP) color vision test; Color and greyscale perception under varying ambient lighting conditions; Geometric and rendering distortions; Visual discomfort.
Efficiency (17): Tool path length/deviations; X-ray acquisitions; Contrast volume used; Cineloops used.
Cognition (27): Attention shifts; Cognitive load (NASA-TLX, Surg-TLX); Task recall; Mental rotation test; Influence on diagnosis, surgical approach, or revisions.
Validity/Reliability (12): Face validity; Content validity; Interrater variability for anatomical landmark detection; Intra- and inter-operator variability of measurements; Inter-observer variability of performance assessment; Statistical similarity of virtual and real landmarks.
Physical Load (1): Muscle fatigue; Posture.
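
Two of the instruments in Table 3 have simple, well-documented scoring rules: the ten System Usability Scale items are rescaled and summed to a 0–100 score [176], and the "raw" NASA-TLX variant averages the six subscale ratings, omitting the pairwise weighting step [183]. The minimal Python sketch below illustrates both computations; the function names and example ratings are ours, not drawn from any reviewed study.

from statistics import mean

def sus_score(responses):
    # Ten Likert ratings (1-5), in item order. Odd-numbered items are
    # positively worded and contribute (rating - 1); even-numbered items
    # contribute (5 - rating). The sum is scaled by 2.5 to give 0-100 [176].
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten ratings between 1 and 5")
    contributions = [(r - 1) if i % 2 == 0 else (5 - r)
                     for i, r in enumerate(responses)]
    return 2.5 * sum(contributions)

def raw_tlx(subscales):
    # Unweighted ("raw") NASA-TLX: mean of the six 0-100 subscale ratings
    # (mental, physical, and temporal demand; performance; effort;
    # frustration), skipping the pairwise-comparison weighting [183].
    if len(subscales) != 6:
        raise ValueError("NASA-TLX has six subscales")
    return mean(subscales)

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
print(raw_tlx([60, 30, 45, 25, 55, 20]))          # ~39.2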