Article

Data-Driven Activities Involving Electronic Health Records: An Activity and Task Analysis Framework for Interactive Visualization Tools

1 Insight Lab, Western University, London, ON N6A 3K7, Canada
2 Room 420, Department of Computer Science, Middlesex College, Western University, London, ON N6A 3K7, Canada
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2020, 4(1), 7; https://doi.org/10.3390/mti4010007
Submission received: 19 December 2019 / Revised: 21 January 2020 / Accepted: 27 February 2020 / Published: 3 March 2020
Figure 1. Relationships among activities, tasks, and interactions. Top-down view: activity is made up of sub-activities, tasks, sub-tasks, and interactions. Bottom-up view: activity emerges over time, through performance of tasks and interactions. Visualizations are depicted as Vis and reactions as R_x. Source: adapted from [7].
Figure 2. Overview of the proposed activity and task analysis framework. Visual tasks are represented in blue and interactive tasks in yellow.
Figure 3. Search results and how we selected the 24 articles that described 19 IVTs.
Figure 4. Lifelines2: Interactive visualization tool for temporal categorical data. Source: image courtesy of the University of Maryland Human–Computer Interaction Lab, http://hcil.umd.edu.
Figure 5. Lifeflow: Interactive visualization tool that provides an overview of event sequences. Source: image courtesy of the University of Maryland Human–Computer Interaction Lab, http://hcil.umd.edu.
Figure 6. Eventflow: Interactive visualization tool for analysis of event sequences for both point-based and interval events. Source: image courtesy of the University of Maryland Human–Computer Interaction Lab, http://hcil.umd.edu.
Figure 7. Caregiver: Interactive visualization tool for visualization of categorical and numerical data. Source: image courtesy of Dominique Brodbeck.
Figure 8. CoCo: Interactive visualization tool for comparing cohorts of event sequences. Source: image courtesy of the University of Maryland Human–Computer Interaction Lab, http://hcil.umd.edu.
Figure 9. Similan: Interactive visualization tool for the exploration of similar records in temporal categorical data. Source: image courtesy of the University of Maryland Human–Computer Interaction Lab, http://hcil.umd.edu.
Figure 10. IPBC: 3D visualization tool for analysis of numerical data from multiple hemodialysis sessions. Source: reprinted from Journal of Visual Languages & Computing, 14, Chittaro L, Combi C, Trapasso G, Data mining on temporal data: a visual approach and its clinical application to hemodialysis, 591–620, copyright (2003), with permission from Elsevier.
Figure 11. TimeRider: Interactive visualization tool for pattern recognition in patient cohort data. Source: reprinted by permission from Springer Nature: Springer, Ergonomics and Health Aspects of Work with Computers, Visually Exploring Multivariate Trends in Patient Cohorts Using Animated Scatter Plots, Rind A, Aigner W, Miksch S, et al., copyright (2011).
Figure 12. VISITORS: Interactive visualization tool for the exploration of multiple patient records. (A) displays lists of patients. (B) displays a list of time intervals. (C) displays the data for a group of 58 patients over the current time interval: Panel 1 shows the raw white blood cell counts for the patients, while Panels 2 and 3 display the monthly distribution states of platelets and haemoglobin at a higher level of abstraction, respectively. Abstractions are encoded in the medical ontologies displayed in panels (D). Source: reprinted from Journal of Artificial Intelligence in Medicine, 49, Klimov D, Shahar Y, Taieb-Maimon M, Intelligent visualization and exploration of time-oriented data of multiple patients, 11–31, copyright (2010), with permission from Elsevier.
Figure 13. MIVA: Interactive visualization tool that shows the temporal change of numerical values, where each variable is represented by an individual point plot. Source: image courtesy of Antony Faiola.
Figure 14. Lifelines: Interactive visualization tool that displays a patient's medical history on a timeline. Source: image courtesy of the University of Maryland Human–Computer Interaction Lab, http://hcil.umd.edu.
Figure 15. VisuExplore: Interactive visualization tool that displays patient data in various views on a timeline. Source: reprinted by permission from Springer Nature: Springer, Human–Computer Interaction, Patient Development at a Glance: An Evaluation of a Medical Data Visualization, Pohl M, Wiltner S, Rind A, et al., copyright (2011).

Abstract

Electronic health records (EHRs) can be used to make critical decisions, to study the effects of treatments, and to detect hidden patterns in patient histories. In this paper, we present a framework to identify and analyze EHR-data-driven tasks and activities in the context of interactive visualization tools (IVTs)—that is, all the activities, sub-activities, tasks, and sub-tasks that are and can be supported by EHR-based IVTs. A systematic literature survey was conducted to collect the research papers that describe the design, implementation, and/or evaluation of EHR-based IVTs that support clinical decision-making. Databases included PubMed, the ACM Digital Library, the IEEE Library, and Google Scholar. These sources were supplemented by gray literature searching and reference list reviews. Of the 946 initially identified articles, the survey analyzes 19 IVTs described in 24 articles that met the final selection criteria. The survey includes an overview of the goal of each IVT, a brief description of its visualization, and an analysis of how sub-activities, tasks, and sub-tasks blend and combine to accomplish the tool's main higher-level activities of interpreting, predicting, and monitoring. Our proposed framework reveals the gaps in the higher-level activities supported by existing IVTs. It appears that almost all existing IVTs focus on the activity of interpreting, while only a few of them support predicting and monitoring—this despite the importance of these activities in assisting users in finding patients who are at high risk and tracking patients' status after treatment.

1. Introduction

An electronic health record (EHR) contains patient data, such as demographics, prescriptions, medical history, diagnosis, surgical notes, and discharge summaries. Healthcare providers use EHRs to make critical decisions, study the effects and effectiveness of treatments, and monitor patient improvement after a particular treatment. In addition to these benefits, EHRs can potentially aid clinical researchers in detecting hidden trends and missing events, revealing unexpected sequences, reducing the incidence of medical errors, and establishing quality control [1,2]. Recently, several healthcare organizations have used systems that incorporate EHR data to improve the quality of care; these systems are intended to replace traditional paper-based medical records [3]. However, a few studies reveal that these EHR-based systems hardly improve the quality of care. One of the reasons for this is that they do not allow for human–data interaction in a manner that fits and supports the needs of healthcare providers [4,5]. A set of technologies and techniques that can improve the efficacy and utility of these EHR-based systems can be found in information visualization [5], or, broadly speaking, interactive visualization tools (IVTs).
IVTs can be defined as computational technologies that use visual representations (i.e., visualizations) to amplify human cognition when working with data [6,7]. IVTs can help people who use them gain better insight by providing the means to explore the data at various levels of granularity and abstraction. An important feature of IVTs that makes them suitable for the exploration of EHRs is the ability to show relevant data quickly by mapping it to visualizations [5]. Another feature is interaction. Making the visualization interactive allows healthcare providers to perform various data-driven tasks and activities. Interaction helps users accomplish their overall goals by dynamically changing the mapping, view, and scope of EHR data. In recent years, a number of EHR-based IVTs have been developed and deployed to support healthcare providers in performing data-driven activities.
To provide a clear and systematic approach in examining EHR-based IVTs for clinical decision support, this paper provides a framework for analyzing tasks and activities supported by these tools. To do so, we will first provide a brief survey of some of the existing IVTs that support the exploration and querying of EHR data and examine overall patterns in these tools. This survey does not include EHR-based IVTs that are designed for clinical documentation, administration, and billing processes.
There are a few studies that review EHR-based IVTs and their applications. Rind et al. [5] reviewed and compared state-of-the-art information visualization tools that involve EHR data using four criteria: (1) data types that they cover, (2) support for multiple variables, (3) support for one versus multiple patient records, and (4) support for user intents. Lesselroth and Pieczkiewicz [8] surveyed different visualization techniques for EHRs. They cover a large number of visualization tools (e.g., Lifelines, MIVA, WBIVS, and VISITORS). Their survey is organized into five sections: (1) multimedia, (2) smart dashboards to improve situational awareness, (3) longitudinal and problem-oriented views to tell clinical narratives, (4) iconography and context links to support just-in-time information, and (5) probability analysis and decision heuristics to support decision analysis and bias identification. Combi et al. [9] reviewed a few visualization tools (e.g., IPBC, KHOSPAD, KNAVE II, Paint Strips, and VISITORS) and described them based on the following features: subject cardinality (single/multiple patients), concept cardinality (single/multiple variables), abstraction level (raw data, abstract concepts, knowledge), and temporal granularity (single, single but variable, multiple). Finally, in a book chapter, Aigner et al. [10] described strategies to visualize (1) clinical guidelines seen as plans (e.g., GEM Cutter, DELT/A), (2) patients’ data seen as multidimensional information space (e.g., Midgaard, VIE-VISU, Gravi++), and (3) patients’ data related to clinical guidelines (e.g., Tallis Tester, CareVis).
A careful examination of the above surveys shows that a systematic analysis of IVTs with a focus on how they support EHR-data-driven tasks and activities is lacking. The purpose of the current paper is to fill this gap. Here, we present a framework for analyzing how IVTs can support different EHR-based tasks and activities. The framework can help designers and researchers to conceptualize the functionalities of EHR-based IVTs in an organized manner. In addition, this paper suggests how the framework can be used to evaluate existing EHR-based IVTs and to design new ones systematically. It can also inform the development of best practices for designing similar frameworks in related areas.
The rest of this paper is organized as follows. Section 2 discusses how the proposed framework is formed and examines the relationships among the three concepts of activities, tasks, and low-level interactions in the context of the framework. Section 3 presents our strategy for searching relevant literature and explains our selection criteria. Section 4 provides a brief survey of a set of IVTs and outlines their main goal(s). In this section, using the proposed analytical framework, we identify the tasks and activities that IVTs support. Finally, Section 5 discusses how the framework can be used to evaluate the surveyed EHR-based IVTs.

2. A Proposed Activity and Task Analysis Framework

In the context of IVTs, user-tool interaction can be conceptualized as actions that are performed by users and consequent reactions that occur via the tool’s interface. This bi-directional relationship between the user and the tool supports the flow of information between the two. Interaction allows for human–information discourse [11]. Furthermore, it allows users to adjust different features of the IVT to suit their analytical needs. Interaction can be characterized at different levels of granularity [7,12]. As displayed in Figure 1, an activity can be conceptualized at the highest level, where it is composed of multiple lower-level tasks (e.g., ranking, categorizing, and identifying) that work together to accomplish the activity’s overall goal. An activity and a task can consist of multiple sub-activities and sub-tasks, respectively. At the lower level, tasks can be considered to have visual and interactive aspects; tasks that are supported by visual processing are called visual tasks. For instance, consider a scenario in which a user is working with a stacked bar chart that aggregates laboratory test results. The user needs to understand the distribution of a specific test of a collection of patients after surgery over time. Some of the visual tasks that the user may need to perform can include detecting the time when the test is at its peak and observing the average test result at different times. Interactive tasks require users to act upon visualizations. For instance, in the example above, the user may want to cluster the test results based on different time granularities (e.g., over an hour, over a day, or over a month). Each interactive task is made up of a number of lower-level actions (i.e., interactions) that are carried out to complete the task.
In most complex situations, activities, sub-activities, tasks, and sub-tasks are combined to support users in accomplishing their overall goal. It is important to note two perspectives from which we can view human–data discourse. From a top-down perspective, users' goals flow from higher-level activities that need to be accomplished. From here, we go down to a number of tasks and sub-tasks (visual and interactive), and then to a set of low-level interactions. From a bottom-up perspective, the performance of a series of low-level interactions that users carry out with visual representations gives rise to tasks. Similarly, the performance of a sequence of tasks gives rise to activities, all the way up until an overall goal is accomplished.
In this paper, we present an activity and task analysis framework for examining EHR-based IVTs (i.e., ones that involve EHRs as their main source of data with which users perform data-driven tasks and activities). To identify what activities, sub-activities, tasks, and sub-tasks are supported in EHR-based IVTs, we have examined a number of such tools that have been developed by different researchers and have been reported in the literature (see Wang et al. [13]; Wongsuphasawat et al. [14]; Wongsuphasawat and Gotz [15]; Malik et al. [16]; Fails [17]; Klimov et al. [18]; Wongsuphasawat [19]; Monroe et al. [20]; Brodbeck et al. [21]; Chittaro et al. [22]; Rind et al. [23]; Plaisant et al. [24]; Faiola and Newlon [25]; Pieczkiewicz et al. [26]; Bade et al. [27]; Hinum et al. [28]; Rind et al. [29]; and Ordonez et al. [30]; Gresh et al. [31]; Horn et al. [32]). To conceptualize and develop the elements of the framework, our focus is the identification of activities and tasks that are independent of any specific technology or platform. To be consistent, we re-interpret how activities and tasks are named by the authors of the afore-listed sources in light of the unified language of our proposed framework. The activity and task terms we use might differ from the language of the existing literature since the authors have described their tools using their own vocabulary. Unfortunately, the language that different authors use is not consistent. Such inconsistency makes it difficult to analyze how well and comprehensively such tools support EHR-based tasks and how they can be improved. In the next section, we define and categorize the higher-level activities that result from interaction and combination of different sub-activities, tasks, and sub-tasks.

2.1. Higher-Level Activities: Interpreting, Predicting, and Monitoring

After reviewing numerous papers [33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49], we have concluded that, broadly speaking, all EHR-data-driven healthcare activities can be organized under three main categories: interpreting [33,34,35,36,37], predicting [38,39,40,41,42,43], and monitoring [44,45,46,47,48,49]. Interpreting refers to the activity of detecting patterns from patients’ medical records and making sense of the relationships among different features. Predicting refers to the activity of anticipating patient outcomes and creating new hypotheses by analyzing patient history and status [50]. Lastly, monitoring refers to the activity of repetitive testing with the aim of adjusting and guiding the management of recurrent or chronic diseases [51].

2.2. Hierarchical Structure of Activities, Sub-Activities, Tasks, and Sub-Tasks

In this section, we identify sub-activities, tasks, and sub-tasks that blend and combine together to give rise to the three activities of interpreting, predicting, and monitoring. Interpreting, as a higher-level activity, can be comprised of four sub-activities: (i) understanding (e.g., gaining insight into patient medical records), (ii) discovering (e.g., finding patients with interesting medical event patterns), (iii) exploring (e.g., observing patient data in different temporal granularities), and (iv) overviewing (e.g., providing compact visual summaries of all event sequences found in the data). Likewise, predicting can be comprised of two sub-activities: (i) learning (e.g., generating new hypotheses from the data), and (ii) discovering (e.g., recognizing the deterioration of the disease). Finally, monitoring is composed of (i) investigating (e.g., examining the development of a patient after treatment), (ii) analyzing (e.g., studying the aggregated event sequences for quality assurance), and (iii) evaluating (e.g., assessing the quality of care based on clinical parameters). At the next level of the hierarchy, as shown in Figure 2, each sub-activity can be composed of a number of visual (e.g., specifying, recognizing, and detecting) as well as interactive tasks (e.g., locating, ordering, querying, and clustering). Moreover, as shown in Table 1, each task consists of different sub-tasks; for instance, ordering can be carried out by a combination of sub-tasks such as ranking, aggregating, identifying, and classifying.
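To make this hierarchy easier to work with, the sketch below encodes it as plain Python data structures. It reproduces only the groupings stated above (the three activities, their sub-activities, the example visual and interactive tasks, and the sub-task example from Table 1); any finer-grained mapping of tasks to sub-activities is left open, so this should be read as an illustrative encoding rather than a normative one.

```python
# Minimal, illustrative encoding of the framework's hierarchy (Section 2.2).
# Only the groupings stated in the text are reproduced; any finer-grained
# mapping of tasks to sub-activities is a design decision left open here.

ACTIVITIES = {
    "interpreting": ["understanding", "discovering", "exploring", "overviewing"],
    "predicting": ["learning", "discovering"],
    "monitoring": ["investigating", "analyzing", "evaluating"],
}

VISUAL_TASKS = ["specifying", "recognizing", "detecting"]
INTERACTIVE_TASKS = ["locating", "ordering", "querying", "clustering"]

# Example from Table 1: the task "ordering" can be carried out by a
# combination of lower-level sub-tasks.
SUB_TASKS = {
    "ordering": ["ranking", "aggregating", "identifying", "classifying"],
}

def decompose(activity):
    """Return the sub-activities of a higher-level activity, together with
    the task vocabulary from which each sub-activity can be composed."""
    return {
        "sub_activities": ACTIVITIES[activity],
        "visual_tasks": VISUAL_TASKS,
        "interactive_tasks": INTERACTIVE_TASKS,
    }

print(decompose("monitoring"))
```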

3. Methods

3.1. Search Strategy

We conducted an electronic literature search in order to collect the research papers that describe the design, implementation, or evaluation of EHR-based IVTs. In order to assure a comprehensive document search, we included all the keywords that are relevant to the goal of the research and also covered all the synonyms and related terms, both for EHRs and visualization tools. We further broadened our search by adding an * to the end of a term to make sure the search engines picked out different variations of the term. We also added quotation marks around phrases to ensure that the exact sequence of words is found. To ensure that relevant papers were not missed in our search, we used a relatively large set of keywords. We used two categories of keywords. The first category concerned visualization tools and included the following terms: “visualization*”, “visualization tool*”, “information visualization*”, “interactive visualization*”, “interactive visualization tool*”, “visualization system*”, and “information visualization system*”. For the second category, EHR, we used the following terms: “Health Record*”, “Electronic Health Record*”, “EHR*”, “Electronic Patient Record*”, “Electronic Medical Record*”, “Patients Record*”, and “Patient Record*”. As we were looking for papers about EHR-based visualization tools, we used the keywords shown in Table 2.
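As a concrete illustration of how the two keyword categories combine, the following sketch assembles a boolean query string of the kind submitted to the search engines; the exact syntax accepted by each engine differs, so the function only schematizes the (visualization terms) AND (EHR terms) structure.

```python
# Sketch of how the two keyword categories were combined into a boolean
# query (cf. Table 2). Actual query syntax varies by search engine; this
# only illustrates the (visualization terms) AND (EHR terms) structure.

VIS_TERMS = [
    "visualization*", "visualization tool*", "information visualization*",
    "interactive visualization*", "interactive visualization tool*",
    "visualization system*", "information visualization system*",
]
EHR_TERMS = [
    "Health Record*", "Electronic Health Record*", "EHR*",
    "Electronic Patient Record*", "Electronic Medical Record*",
    "Patients Record*", "Patient Record*",
]

def build_query(group_a, group_b):
    quoted_a = " OR ".join(f'"{term}"' for term in group_a)
    quoted_b = " OR ".join(f'"{term}"' for term in group_b)
    return f"({quoted_a}) AND ({quoted_b})"

print(build_query(VIS_TERMS, EHR_TERMS))
```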
We used the following search engines based on their relevance to the field: PubMed, the ACM Digital Library, the IEEE Library, and Google Scholar. We also looked for relevant papers in two medical informatics journals (International Journal of Medical Informatics and Journal of the American Medical Informatics Association). Furthermore, additional papers were collected in conference proceedings (e.g., IEEE Conference on Visual Analytics Science and Technology (VAST), HCIL Workshop 2015, and IEEE VisWeek Workshop on Visual Analytics in Health Care) that were published in 2007 and later. We then manually reviewed the reference lists of the papers that met the selection criteria to find other relevant studies that had not been identified in the database search. All the studies included in this survey were published from 1998 until 2015. We reviewed all of the abstracts, removed the duplicates, and shortlisted abstracts for a more detailed assessment.

3.2. Selection Criteria

Out of all the studies that survived the initial filtering, we only included those that described an interactive visualization tool and provided a detailed description of the tool’s visualization and its interaction design in order to analyze how the tool can support different EHR-data-driven tasks and activities. All the papers related to the visualization of any administrative tasks with patient data, medical guidelines, genetics data, and syndromic surveillance were excluded from our survey as we only focused on clinical EHR data. We also excluded the studies that were solely focused on the visualization of free text (e.g., the patient’s progress notes) and medical images (e.g., magnetic resonance imaging, and X-ray images).

3.3. Results

A total of 912 articles were identified from our initial search of electronic databases. A search of the gray literature and manually searching references from articles resulted in an additional 34 papers. We removed a total of 205 duplicates from the 946 articles, both within and between search engines. We then reviewed all the abstracts and excluded a further 685 articles. Next, we read the full text of the 56 remaining articles and excluded the ones that did not meet the selection criteria. Finally, 24 studies remained for the analysis. The results of the selection procedure are displayed in the flow diagram in Figure 3.

4. Survey of the Interactive Visualization Tools

In this section, we provide a survey of 19 IVTs that are described in the chosen articles and use our proposed activity and task framework to analyze them. The survey includes an overview of the goal of the IVT, a brief description of its visualization, and an analysis of how sub-activities, tasks, and sub-tasks blend and combine to accomplish the tool's main higher-level activities of interpreting, predicting, and monitoring. A very important criterion to differentiate IVTs is whether they support activities that involve multiple patient records or exploration of an individual patient. We divide our survey into two different types of IVTs based on this criterion: population-based tools and single-patient tools. Initially, studies were focused on single-patient tools, but since 2010, most IVTs have been developed to support large numbers of patient records. Our survey includes more population-based tools, as it seems that these are more prevalent than single-patient tools. For the first type, we survey 14 tools, and for the second type, we survey five tools.

4.1. Population-Based Tools

Population-based IVTs support data-driven activities that involve multiple patient records simultaneously and in aggregate form. Although these types of tools display fewer details about a particular patient, they provide users with the ability to recognize patterns, detect anomalies, find desired records, and cluster and aggregate records into different groups. In this section, we survey fourteen population-based IVTs.

4.1.1. Lifelines2

Lifelines2 [13,52] enables users to explore and analyze a set of temporal categorical patient records interactively. As shown in Figure 4, each record is represented by a horizontal strip containing patient ID and multiple events in patient history that occur at various times. Each event shows up as a color-coded triangle icon on a horizontal timeline. Lifelines2 allows the detection of temporal patterns and trends across EHRs to facilitate hypothesis generation and identify cause-and-effect relationships between patient records.
This tool supports the activity of interpreting by allowing users to get a better understanding of clinical problems and discovering patients with interesting medical event patterns. It also supports monitoring by investigating the impact of hospital protocol changes in patient care. It allows for temporal ordering of event sequences, observing the distribution of temporal events, and locating records with particular event sequences. These tasks (ordering, observing, locating) are supported by sub-tasks such as ranking, aggregating, and identifying.

4.1.2. Lifeflow

Lifeflow [14,53] provides a visual summary for the exploration and analysis of event sequences in EHR data. While in Lifelines2 it is not possible to see all records simultaneously due to limited screen space, Lifeflow gives users the ability to answer questions that require an overview of all the records. To convert from the Lifelines2 view to Lifeflow, a data structure called a "tree of sequences" is created by aggregating all the records. This structure is then converted into a Lifeflow view, with each node representing an event bar. Figure 5 shows the Lifeflow visualization, where all the records are vertically stacked on the horizontal timeline and all the events are represented using color-coded triangles.
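The aggregation behind the "tree of sequences" can be sketched as a simple prefix tree in which records that share the same leading events are merged into one node. The snippet below is a minimal illustration of that idea under hypothetical event names; it omits the timing information that LifeFlow also stores.

```python
# Minimal sketch of aggregating event sequences into a "tree of sequences":
# records sharing the same prefix of events are merged into one node.
# (The real LifeFlow structure also tracks event timing; omitted here.)

def build_sequence_tree(records):
    """records: list of event sequences, e.g. [["Arrival", "ER", "ICU"], ...]"""
    root = {"event": None, "count": 0, "children": {}}
    for events in records:
        node = root
        node["count"] += 1
        for event in events:
            child = node["children"].setdefault(
                event, {"event": event, "count": 0, "children": {}})
            child["count"] += 1
            node = child
    return root

records = [
    ["Arrival", "ER", "ICU", "Discharge"],
    ["Arrival", "ER", "Discharge"],
    ["Arrival", "ER", "ICU", "Discharge"],
]
tree = build_sequence_tree(records)
print(tree["children"]["Arrival"]["children"]["ER"]["count"])  # -> 3
```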
In this IVT, the sub-activities of exploring and overviewing medical events support the activity of interpreting, while analyzing aggregated event sequences for quality assurance supports the activity of monitoring. Recognizing patterns and temporal ordering of aggregated event sequences are two tasks that enable Lifeflow to support exploring, overviewing, and analyzing sub-activities. Finally, sub-tasks such as aggregating, identifying, and classifying work together to accomplish higher-level tasks.

4.1.3. Eventflow

Eventflow [20] provides users with the ability to query, explore, and visualize interval data interactively. It allows pattern recognition by visualizing events in both a timeline that displays all individual records and an aggregated overview that shows common and rare patterns. As displayed in Figure 6, all the records are shown on a scrollable timeline browser. On the horizontal timeline, point-based events are displayed as triangles, while interval events are represented by connected rectangles. In the center, an aggregated display gives users an overview of all event sequences in EHR data. The aggregation method works exactly like the one in Lifeflow, but it has been extended in Eventflow to work for interval events. All the records with the same event sequence are aggregated into a single bar, and the average time between two events among the records in the group is represented by the horizontal gap between the two bars.
This tool supports interpreting by providing an overview of all event sequences found in the data and exploring medical events (point-based events as well as interval events). The overviewing and exploring sub-activities can be accomplished by recognizing temporal patterns and simplifying temporal event sequences. Monitoring can be accomplished by investigating aggregated event sequences. The investigating sub-activity is supported by detecting anomalies in the data. Eventflow supports predicting by learning new hypotheses, where this sub-activity can be carried out by tasks such as specifying temporal patterns and simplifying temporal event sequences. Aggregating, identifying, and classifying are the lowest-level sub-tasks for Eventflow.

4.1.4. Caregiver

Caregiver [21] is an IVT that supports therapeutic decision making, intervention, and monitoring. As displayed in Figure 7, the tool has three different views, where the upper view displays the duration and size of the patient groups that are chosen by physicians to receive interventions. The lower view shows the chosen attributes for each patient along a common timeline. Caregiver allows users to create new cohorts from the search results based on a combination of values of any number of variables.
In this tool, the activity of interpreting can be accomplished by discovering trends, critical incidents, and cause–effect relationships. Caregiver also supports predicting by allowing users to learn about the deterioration in the status of a disease. It supports these sub-activities (discovering and learning) by specifying temporal relationships and clustering. Specifying and clustering can be carried out by sub-tasks such as identifying, classifying, and ranking.

4.1.5. CoCo

CoCo [16,54] is an IVT for comparing cohorts of sequences of events recorded in EHRs. It provides users with overview and event-level statistics of the chosen dataset along with a list of available metrics to generate new hypotheses. It consists of a file manager pane, a dataset statistics pane, an event legend, a list of available metrics, the main window, and options for filtering and sorting the results (as shown in Figure 8). The summary panel includes high-level statistics containing the total number of records and events in each record.
CoCo supports the activity of interpreting by allowing users to explore and investigate two groups of temporal event sequences simultaneously. The activity of predicting can be accomplished by learning new hypotheses from the statistical analysis while comparing the event sequences (i.e., detecting differences among groups of patients). Ranking, classifying, and identifying are the lowest-level sub-tasks in CoCo.

4.1.6. Similan

Similan [19] is a tool that provides users with the ability to discover and explore similar records in a temporal categorical dataset. Records are ranked by their similarity to a target record, which can be either a reference record or a user-specified sequence of events. The similarity measure considers the transposition, addition, and removal of events, as well as the temporal differences between matched events, to estimate the similarity of temporal sequences. Similan lets users visually compare the selected target with a set of records and rank those records based on the matching score, as shown in the middle panel on the left side of Figure 9.
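The following toy score only illustrates how structural edits (added, removed, and reordered events) can be combined with temporal differences between matched events to rank records against a target; it is not Similan's actual matching algorithm, and the event names, weights, and time scale are invented for the example.

```python
# Toy illustration only: combines a sequence-similarity term (penalizing
# added, removed, and reordered events) with a penalty for temporal
# differences between events matched by type. Similan's real measure and
# matching procedure are more sophisticated than this sketch.

from difflib import SequenceMatcher

def toy_similarity(target, record, time_scale=30.0):
    """target/record: ordered lists of (event_type, day) tuples."""
    structural = SequenceMatcher(
        None, [e for e, _ in target], [e for e, _ in record]).ratio()
    diffs = [abs(td - rd)
             for te, td in target
             for re_, rd in record if te == re_]
    temporal = 1.0 / (1.0 + (sum(diffs) / len(diffs)) / time_scale) if diffs else 0.0
    return 0.5 * structural + 0.5 * temporal

target = [("Admission", 0), ("Surgery", 2), ("Discharge", 9)]
candidates = {
    "patient_A": [("Admission", 0), ("Surgery", 3), ("Discharge", 10)],
    "patient_B": [("Admission", 0), ("Discharge", 30)],
}
ranked = sorted(candidates,
                key=lambda p: toy_similarity(target, candidates[p]),
                reverse=True)
print(ranked)  # patient_A ranks above patient_B
```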
In this IVT, interpreting can be carried out by exploring and discovering similar records in temporal categorical data where these sub-activities themselves are supported by detecting (calculating similarity measure among records) and recognizing similarity among records. Predicting is accomplished by discovering patients with similar symptoms to a certain target patient. The sub-activity discovering can be carried out by tasks such as temporal ordering and dynamic query. Finally, sub-tasks such as ranking, identifying, and classifying work together to accomplish higher-level tasks.

4.1.7. Outflow

Outflow [15,55] is a graph-based visualization that shows the eventual outcome across the event sequences in patient records. It aggregates and displays event progression pathways and their corresponding properties, such as cardinality, outcomes, and timing. The tool allows users to interactively analyze the event sequences and detect their correlation with external factors (i.e., factors beyond the collection of event types that specify an event sequence). The visualization is a state transition diagram represented as a directed acyclic graph. The states (nodes) are unique combinations of patient symptoms that are mapped to rectangles, where the height of each rectangle is proportional to the number of patients. The graph is divided into different layers vertically, where layer i consists of all states in the graph with i symptoms. These layers are arranged from left to right, displaying patient history from past to future. Edges display transitions among symptoms, where each edge encodes the number of patients that are involved in the transition and the average time interval between different states. The end state, represented by a trapezoid followed by a circle, is used to mark points where the patient paths have ended. Finally, the color of the edges and end states represents the average outcome for the corresponding group of patients.
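The state-graph aggregation described above can be sketched as follows: each state is the set of symptoms accumulated so far, and each directed edge records how many patients made that transition and the average transition time. This is only an illustration of the data structure, not Outflow's implementation; the symptom names are hypothetical.

```python
# Sketch of the state-graph aggregation: each state is the set of symptoms a
# patient has acquired so far, and each edge accumulates how many patients
# made that transition and the average transition time.

from collections import defaultdict

def build_state_graph(patients):
    """patients: list of ordered (symptom, day) sequences, one per patient."""
    edges = defaultdict(lambda: {"count": 0, "total_days": 0.0})
    for events in patients:
        state, prev_day = frozenset(), 0.0
        for symptom, day in events:
            nxt = state | {symptom}
            e = edges[(state, nxt)]
            e["count"] += 1
            e["total_days"] += day - prev_day
            state, prev_day = nxt, day
    return {k: {"count": v["count"],
                "avg_days": v["total_days"] / v["count"]}
            for k, v in edges.items()}

patients = [
    [("cough", 0), ("fever", 2)],
    [("cough", 0), ("fever", 5)],
    [("fever", 1)],
]
graph = build_state_graph(patients)
print(graph[(frozenset({"cough"}), frozenset({"cough", "fever"}))])
# -> {'count': 2, 'avg_days': 3.5}
```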
In this tool, sub-activities of exploring and overviewing event sequences work together to accomplish the activity of interpreting. Outflow also supports predicting by allowing users to discover the progression of temporal event sequences. The sub-activities of exploring, overviewing, and discovering can be accomplished by summarizing temporal event sequences, specifying temporal relationships, and detecting patterns from statistical summaries. Finally, aggregating, identifying, and classifying are the lowest-level sub-tasks.

4.1.8. IPBC

IPBC [22] (Interactive Parallel Bar Charts) is an interactive 3D visualization of temporal data. IPBC applies visual data mining to a real medical problem, namely the management of multiple hemodialysis sessions. It provides users with the ability to make various decisions regarding therapy, management, and medical research. Each time series is displayed as a 3D bar chart where one of the horizontal axes shows time and the vertical axis represents the value, as displayed in Figure 10. Bar charts lined up along the second horizontal axis enable users to view all the series simultaneously.
IPBC supports interpreting by allowing users to explore patient data interactively. Monitoring can be carried out by evaluating the quality of care based on certain clinical parameters. The sub-activities of exploring and evaluating are supported by specifying temporal relationships and recognizing similar patterns where these tasks themselves can be accomplished by sub-tasks such as identifying, classifying, and ranking.

4.1.9. Gravi++

Gravi++ [28] allows users to explore and analyze multiple categorical variables using interactive visual clustering. This tool uses a spring-based layout to place both patient and variable icons across the visualization, where the value of a variable for a patient determines the distance between that patient's icon and the variable's icon. Gravi++ provides users with the ability to detect clusters since patients with similar values are placed close together on screen. In order to visualize the exact values of each variable for each patient, the tool shows each patient's value as a circle around variables. The patient icons are represented by spheres, while the variable icons are encoded by squares. Moreover, the tool can encode different patient attributes using patient icons; for instance, the size of the sphere can be mapped to the body mass index of the patient and its color can encode the patient's gender or therapeutic outcome.
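A minimal sketch of this spring-based placement is given below, assuming variable icons sit on a circle and each patient icon settles at the equilibrium of attractions proportional to that patient's normalized values; Gravi++'s actual layout algorithm differs in detail, and the variable names are invented for illustration.

```python
# Sketch of spring-based placement: variable icons are fixed anchors, and a
# patient icon is pulled toward each anchor in proportion to the patient's
# value for that variable, so patients with similar value profiles cluster.
# Illustrative only; not Gravi++'s actual algorithm.

import math

def anchor_positions(variables, radius=1.0):
    """Place variable icons evenly on a circle."""
    n = len(variables)
    return {v: (radius * math.cos(2 * math.pi * i / n),
                radius * math.sin(2 * math.pi * i / n))
            for i, v in enumerate(variables)}

def patient_position(values, anchors):
    """values: mapping variable -> normalized value in [0, 1]."""
    total = sum(values.values()) or 1.0
    x = sum(values[v] * anchors[v][0] for v in values) / total
    y = sum(values[v] * anchors[v][1] for v in values) / total
    return (x, y)

anchors = anchor_positions(["weight", "anxiety", "depression"])
print(patient_position({"weight": 0.9, "anxiety": 0.1, "depression": 0.2}, anchors))
```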
This tool supports the activity of interpreting by allowing users to explore patient data and discover clusters of similar patients. Monitoring can be accomplished by investigating the development of a patient after a certain treatment. The sub-activities of exploring, discovering, and investigating are supported by tasks such as recognizing patterns and specifying temporal relationships. Finally, identifying and classifying are the lowest-level sub-tasks that are supported by the tool.

4.1.10. PatternFinder

PatternFinder [17] is a query-based tool for data visualization and visual query that can help users search and discover temporal patterns within multivariate categorical data. PatternFinder allows users to specify queries for temporal events with time span and value constraints and enables them to look for temporally ordered events/values/trends as well as the existence of events. Also, users can set a range of possible time spans among the events to specify how far apart the events are from each other. The tool has two main panels: the pattern design and query specification panel and the result visualization panel. The leftmost part of the pattern design panel is the Person/People panel, which enables users to limit the types of patients by name, by choosing from a list of patients, or by typing a text string. Any modifications made in this panel act as dynamic queries that immediately update the results in the result visualization panel. The temporal panel, placed to the right of the Person/People panel, enables users to form temporal pattern queries by chaining events together. Users are able to search for the presence of events, the temporal sequence of events (e.g., an emergency doctor's visit followed by a hospitalization), the temporal sequence of values (e.g., a cholesterol value of 200 or below followed by one of 240 or higher), and temporal value patterns (e.g., monotonically decreasing). The result visualization panel displays a graphical table of all the matches, where each row shows a single pattern match for one patient. Pattern matches are represented as a timeline in a "ball-and-chain" visualization, where event points are shown as circles and time spans are displayed as blue bars between the events. The color of the event point in the result visualization panel matches the color of the associated event in the query specification panel. All the events that match the query pattern specified by users are linked together by horizontal lines.
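A chained temporal pattern query of this kind can be sketched as an ordered list of (event type, value predicate) steps plus a maximum gap between consecutive matches. The snippet below is a simplified, greedy illustration under invented event names; it does not reproduce PatternFinder's full query model (people filters, value trends, per-link time spans, and so on).

```python
# Sketch of a chained temporal pattern query: each step constrains the event
# type and value, and consecutive steps constrain how far apart in time the
# matched events may be. Simplified illustration only.

def matches(record, steps, max_gap_days):
    """record: list of (day, event_type, value), ordered by day.
    steps: list of (event_type, predicate) pairs to match in order.
    max_gap_days: maximum days allowed between consecutive matched events."""
    idx, last_day = 0, None
    matched = []
    for day, etype, value in record:
        etype_ok = etype == steps[idx][0] and steps[idx][1](value)
        gap_ok = last_day is None or (day - last_day) <= max_gap_days
        if etype_ok and gap_ok:
            matched.append((day, etype, value))
            last_day = day
            idx += 1
            if idx == len(steps):
                return matched
    return None

record = [(0, "cholesterol", 195), (40, "cholesterol", 250), (60, "ER visit", None)]
query = [("cholesterol", lambda v: v <= 200), ("cholesterol", lambda v: v >= 240)]
print(matches(record, query, max_gap_days=90))  # both steps match within 90 days
```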
In this tool, the activity of interpreting is supported by discovering patterns and exploring patient data dynamically, where these sub-activities themselves can be carried out by tasks such as specifying temporal relationships and issuing dynamic queries. Identifying and ranking are the two low-level sub-tasks that work together to support the aforementioned tasks.

4.1.11. TimeRider

TimeRider [23] offers an animated scatter plot to help users discover patterns in irregularly sampled patient data covering several time spans. As shown in Figure 11, time is represented by either traces or animation in TimeRider. Color, shape, and size of marks are used to encode up to three additional variables. Users can compare patient records of different time spans by synchronizing patients’ age, calendar date, and the start and end of the treatment.
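The synchronization described above can be sketched as re-expressing each record's timestamps relative to a chosen reference point (birth date for age, a fixed calendar origin, or the start of treatment); the field names and data layout below are assumptions made for illustration, not TimeRider's actual data model.

```python
# Sketch of synchronizing patient records of different time spans by
# shifting each record's timestamps to a common reference point.

from datetime import date

def synchronize(records, mode):
    """records: list of dicts with 'birth', 'treatment_start', and
    'measurements' as a list of (date, value) pairs."""
    out = []
    for rec in records:
        if mode == "age":
            origin = rec["birth"]
        elif mode == "treatment start":
            origin = rec["treatment_start"]
        else:  # calendar date
            origin = date(2000, 1, 1)
        out.append([((d - origin).days, v) for d, v in rec["measurements"]])
    return out

records = [
    {"birth": date(1950, 5, 1), "treatment_start": date(2009, 3, 10),
     "measurements": [(date(2009, 3, 10), 7.1), (date(2009, 6, 10), 6.4)]},
    {"birth": date(1972, 1, 20), "treatment_start": date(2010, 8, 1),
     "measurements": [(date(2010, 8, 1), 8.0), (date(2010, 11, 1), 7.2)]},
]
print(synchronize(records, "treatment start"))  # both records now start at day 0
```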
This tool supports interpreting by allowing users to detect trends, clusters, and correlations and providing them with an overview to visually compare patient data in parallel. The sub-activities of detecting and overviewing can be carried out by tasks such as specifying temporal relationships, clustering, and recognizing patterns. Identifying and aligning are the sub-tasks that work together to support the aforementioned tasks.

4.1.12. VISITORS

VISITORS [18,56] is an IVT that allows for exploration, analysis, and retrieval of raw temporal data. The tool uses raw numerical data (e.g., white blood cell counts) across time to derive temporal abstractions (e.g., durations of low, normal, or high blood-cell-count levels for patients). It then uses lower-level temporal abstractions in conjunction with raw data to generate higher-level abstractions. Finally, patient groups’ values are aggregated and displayed. Figure 12 shows this tool’s visualization environment, where raw numerical data is represented by line charts, whereas categorical data is displayed as tick marks or bars on a horizontal zoomable timeline.
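The derivation of temporal abstractions from raw values can be sketched as categorizing each measurement and merging consecutive measurements in the same category into one interval. The thresholds below are placeholders rather than clinical values; VISITORS derives its abstractions from a medical knowledge base rather than from fixed cut-offs.

```python
# Sketch of deriving a temporal abstraction from raw values: consecutive
# measurements in the same category (low / normal / high) are merged into
# one interval. Thresholds are placeholders, not clinical reference values.

def categorize(value, low=4.0, high=11.0):
    return "low" if value < low else "high" if value > high else "normal"

def abstract_intervals(series):
    """series: list of (day, value) pairs, ordered by day."""
    intervals = []
    for day, value in series:
        state = categorize(value)
        if intervals and intervals[-1]["state"] == state:
            intervals[-1]["end"] = day          # extend the current interval
        else:
            intervals.append({"state": state, "start": day, "end": day})
    return intervals

wbc = [(1, 2.9), (3, 3.5), (7, 6.2), (10, 7.1), (14, 12.4)]
print(abstract_intervals(wbc))
# -> low from day 1 to 3, normal from day 7 to 10, high at day 14
```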
In this tool, the activity of interpreting is supported by exploring patient data in different temporal granularities. The sub-activity of exploring can be carried out by tasks such as specifying relationships, observing the distribution of aggregated values of a group of patients, and locating records based on specific time and value constraints. VISITORS supports the activity of monitoring by sub-activities, such as investigating treatment effects, clinical trial results, and quality of clinical management processes. The latter sub-activity, investigating, can be carried out by the task of recognizing patterns as well as all the other tasks needed to support the activity of interpreting. Finally, aggregating, classifying, aligning, and identifying are the lowest-level sub-tasks that are supported by this tool.

4.1.13. Prima

Prima [31] is a population-based IVT that allows users to explore the categorical and numerical data by constructing different linked views. This helps users to not only understand the large set of patient records but also discover patterns and trends in the dataset. The aggregated window provides an overview of the categorical variables by showing the proportions of patients in each category for those variables using stacked bar charts. This window enables users to filter patients by applying a color “brush”. It also displays correlations among different categorical variables through interactive coloring. Another view displays a histogram of numerical variables. The data can also be explored with a 2D scatter plot. Another view of the data is called multiple category tables. It shows the values of either a single variable or multiple categories. Finally, the tool incorporates the Kaplan–Meier curve to estimate the survival function from the patient data.
Prima supports the activity of interpreting by allowing users to explore patient data interactively, where this sub-activity itself can be accomplished by recognizing patterns and specifying temporal relationships. Finally, aggregating and ranking are the lowest-level sub-tasks that are supported by the tool.

4.1.14. WBIVS

WBIVS [26] is a web-based interactive tool that visualizes numerical and categorical variables for lung transplant home monitoring data. Numerical variables are displayed in line plots, while categorical variables are visualized in matrix plots. The tool visualizes ten variables in total. When a data point gets selected, all the other data points that belong to the same time period will get highlighted in the other charts. Moreover, users can find details about the last two chosen data points on the right part of the graph.
This tool supports the interpreting activity by allowing users to explore patient data interactively and discover patterns. Monitoring is supported by investigating treatment effects. The exploring and discovering sub-activities can be accomplished by tasks such as specifying temporal relationships among data points and organizing data for pattern recognition. These tasks can be composed of lowest-level sub-tasks, such as identifying, classifying, and highlighting.

4.2. Single-Patient Tools

Single-patient IVTs provide visualizations of a single patient record at a time. These tools enable users to overview a given patient's historical data, detect important events in the patient's history, and recognize trends. In this section, we survey five single-patient IVTs.

4.2.1. Midgaard

Midgaard [27] allows for exploration of intensive care unit data at different levels of abstraction, from overview to details. It uses visualizations to display numerical variables of treatment plans. It incorporates a complex semantic zoom method for numerical variables by calculating their categorical abstractions based on the available screen area and zoom level. Midgaard provides users with the ability to switch between different views, such as a colored background, colored bars, area charts, or augmented line charts, based on the level of detail. The tool progressively switches to a more detailed view that displays all the individual data points when users zoom in, and switches back to more compact graphical elements when they zoom out.
Midgaard can also visualize medical treatment plans using colored bars where each bar can contain further bars displaying sub-plans. It allows users to navigate and zoom by interacting with two time axes that are placed below the visualization area. The bottom axis displays a temporal overview of the patient record while the middle axis allows users to see specific time intervals in more detail.
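The semantic-zoom behaviour described above can be sketched as choosing a representation from the number of horizontal pixels available per visible data point; the thresholds and representation names below are illustrative assumptions, not Midgaard's actual rules.

```python
# Sketch of the semantic-zoom idea: pick a representation for a numerical
# variable from the horizontal pixels available per data point. Thresholds
# and representation names are illustrative assumptions.

def choose_representation(n_points, width_px):
    px_per_point = width_px / max(n_points, 1)
    if px_per_point < 1:
        return "colored background"   # most compact: category as color only
    if px_per_point < 4:
        return "colored bars"
    if px_per_point < 12:
        return "area chart"
    return "line chart with individual data points"

# Zooming in reduces the number of visible points, so the view is
# progressively refined toward showing individual values.
for visible_points in (5000, 400, 100, 20):
    print(visible_points, "->", choose_representation(visible_points, 600))
```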
The activity of interpreting is supported by exploring patient data at different levels of abstraction, where this sub-activity itself can be accomplished by tasks such as recognizing fluctuations in data. Identifying and classifying are the two sub-tasks that are supported by this tool.

4.2.2. MIVA

MIVA [25] (Medical information visualization assistant) is a tool that transforms and organizes biometric data into temporal resolutions to provide healthcare providers with contextual knowledge. It allows users to prioritize and customize visualizations based on specific clinical problems. It visualizes the data using point plots to display temporal changes in numerical values, where each variable is represented by a separate plot, as shown in Figure 13. MIVA enables users to detect changes in multiple physiological data points over time for faster and more accurate diagnosis. Users can control the data source, time resolutions, and time periods to narrow down the assessment of a patient’s condition.
This tool supports the activity of interpreting by enabling users to carry out sub-activities such as exploring longitudinal relationships in patient data where this sub-activity can be accomplished by tasks such as specifying temporal relationships and recognizing patterns. At the level of sub-tasks, this tool supports identifying as well as classifying.

4.2.3. VIE–VISU

VIE–VISU [32] uses a set of glyphs to display changes in a patient's status over time in intensive care. Each glyph's geometrical shape and color encodes categorical variables, while numerical variables are represented by the size of the glyph's elements. Every glyph can encode 15 variables that are classified by physiological system. For instance, the respiratory parameters are mapped to a rectangle in the middle of the glyph; the circulatory parameters are mapped to a triangle on top of the glyph; and the fluid balance parameters are shown by two smaller rectangles at the bottom of the glyph. By default, the tool displays 24 glyphs, one per hour.
The activity of interpreting can be accomplished by overviewing a patient's status, where this sub-activity is supported by tasks such as recognizing patterns. This tool supports monitoring by evaluating changes in a patient's status over time. The task of identifying temporal relationships supports the sub-activity of evaluating. Finally, aggregating and classifying are two sub-tasks that can be carried out by the tool.

4.2.4. Lifelines

Lifelines [24] offers a visualization environment to show patient history on a zoomable timeline, where a patient’s medical record is displayed by a set of events and lines. Episodes and events in a patient record are represented by a set of multiple line segments as shown in Figure 14. Color can be used to encode the states of categorical variables. This IVT provides an overview of a patient history to recognize trends, specify important events, and detect omissions in data.
The activity of interpreting is supported by understanding a patient's status, where this sub-activity itself can be carried out by tasks such as recognizing patterns and specifying temporal relationships. The tool supports monitoring by allowing users to carry out sub-activities such as investigating trends and anomalies in patient data. The investigating sub-activity is supported by outlining and summarizing the patient data. Finally, aggregating, classifying, and identifying are the sub-tasks that are supported by the tool.

4.2.5. VisuExplore

VisuExplore [29,57] displays patient data in different views aligned with a horizontal timeline, where each view shows multiple variables. This IVT uses common visualization techniques that make it easy to use and learn. In this tool, numerical data are displayed using bar charts and line plots, whereas categorical data are represented using event charts and timeline charts, as shown in Figure 15.
In this tool, the activity of interpreting is supported by exploring temporal data of patients with chronic diseases, where this sub-activity can be carried out by tasks such as specifying temporal relationships. Finally, aligning and identifying are two sub-tasks that can be carried out by the tool.

5. Discussion and Limitations

In this paper, we have proposed a framework to identify and analyze EHR-data-driven tasks and activities in the context of IVTs—that is, all the activities, sub-activities, tasks, and sub-tasks that are supported by EHR-based IVTs. Using a survey of 19 EHR-based IVTs, we demonstrate how these IVTs support activities by identifying the combination of sub-activities, tasks, and sub-tasks that work together to help users carry out the three higher-level activities, as displayed in Table 3. Interpreting is supported by all IVTs surveyed in this paper. Eventflow, Similan, CoCo, Outflow, and Caregiver are the only IVTs that support predicting, whereas Lifelines2, Lifeflow, Eventflow, Gravi++, IPBC, TimeRider, VISITORS, WBIVS, VIE-VISU, Lifelines, CoCo, and VisuExplore are the tools that facilitate monitoring. Going down from high-level activities, recognizing patterns and specifying temporal relationships are the most common sub-activities that help users with the activity of interpreting in most of the IVTs. The existing EHR-based IVTs support predicting by giving users the ability to perform sub-activities such as learning new hypotheses, discovering patients with similar symptoms to a target patient, and detecting early deterioration of a disease. Finally, the most common sub-activities that facilitate monitoring are evaluating the quality of care and investigating the development of a patient's status after treatment.
Our proposed framework can offer a number of benefits for designers, researchers, and evaluators of EHR-based IVTs. Firstly, the framework can help the designer to conceptualize activities, tasks, and sub-tasks of EHR-based IVTs systematically. Secondly, it can assist researchers in making sense of IVTs by providing them with all the activities that can be accomplished by carrying out different sets of sub-activities, tasks, and sub-tasks. Thirdly, this framework can be used by evaluators to identify gaps in the higher-level activities supported by existing IVTs. It appears that almost all existing IVTs focus on the activity of interpreting, while only a few of them support predicting, despite the importance of this activity in helping users find patients who are at high risk and identify the risk factors of various diseases. Also, some of the EHR-based IVTs do not pay enough attention to monitoring, even though this activity is beneficial in investigating the quality of clinical management processes. All these higher-level activities should be an integral part of a properly designed EHR-based IVT since healthcare providers use such tools to (1) better understand patients' condition, (2) anticipate the course of a specific disease, and (3) track patients' condition after treatment. Most of the tools surveyed in this paper can only satisfy a certain aspect of users' needs. According to a recent survey in the US, 40% of clinicians are not satisfied with their existing EHR-based systems [58]. Therefore, a framework is needed to guide the designer of an IVT in choosing which activities, tasks, and sub-tasks the tool should support. Using questions such as "What activities can users accomplish by executing a set of tasks?" or "What tasks should be supported to provide users with the ability to perform their activities?", we demonstrate how the proposed framework can be used by designers of EHR-based IVTs to systematically conceptualize and design the tasks and activities of such tools. Given the framework, all that designers need to know is which low-level sub-tasks, tasks, and sub-activities to select and how to blend and combine them to support higher-level activities and allow users to accomplish their overall goal. For instance, if a designer wants to design an IVT to monitor an infant's condition in the neonatal intensive care unit, they can choose different sets of sub-activities, such as investigating the effect of a specific treatment or evaluating changes in the infant's status over time. Then, the designer selects a combination of tasks such as the temporal ordering of event sequences or displaying the distribution of temporal events to support the chosen sub-activities. Finally, a set of sub-tasks, such as ranking, aggregating, and identifying, is chosen to support the selected tasks.
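Following the neonatal intensive care example above, such a selection can be written down as a small specification under the framework; the structure and helper below are a hypothetical sketch for illustration, not an implementation prescribed by the framework.

```python
# Hypothetical sketch of a design specification for a NICU monitoring tool,
# following the example in the text: chosen sub-activities, the tasks that
# support them, and the sub-tasks that implement those tasks.

nicu_tool_spec = {
    "activity": "monitoring",
    "sub_activities": {
        "investigating": {
            "goal": "effect of a specific treatment on the infant",
            "tasks": ["temporal ordering of event sequences"],
        },
        "evaluating": {
            "goal": "changes in the infant's status over time",
            "tasks": ["displaying the distribution of temporal events"],
        },
    },
    "sub_tasks": ["ranking", "aggregating", "identifying"],
}

def required_capabilities(spec):
    """Flatten the spec into the list of tasks the tool must support."""
    return [task
            for sub in spec["sub_activities"].values()
            for task in sub["tasks"]]

print(required_capabilities(nicu_tool_spec))
```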
We believe a successful EHR-based tool should be capable of doing more than just storing, retrieving, and exchanging patient data. It should support more complex activities, tasks, and sub-tasks to allow healthcare providers to accomplish their goals. Our proposed framework promises a new means for designers of EHR-based IVTs to understand the effectiveness of incorporating such activities, tasks, and sub-tasks in their tool. The use of our framework in EHR-based IVTs will also help physicians to make better treatment decisions and track changes in a patient’s condition over time.
This paper has three key limitations. First, we do not investigate the completeness and accuracy of the data sources that IVTs are using as our survey relies on the descriptions of the IVTs found in publications and video tutorials. Second, as the main goal of this paper is the analysis of EHR-based IVTs, we exclude tools that are mainly dependent on statistical and machine learning methods. Finally, we do not consider commercial tools in this paper. This is because online descriptions of such tools do not systematically and thoroughly cover the features of these tools, i.e., their visualizations, interactions, and results.
The findings of this paper will lead to the development of best practices for creating similar frameworks in other domains. A possible area of future research involves developing frameworks for visual analytics tools that incorporate automated analysis techniques along with interactive visualizations to support the increasingly large and complex datasets in EHRs.

Author Contributions

All authors have read and agreed to the published version of the manuscript. Conceptualization, N.R., S.S.A., and K.S.; methodology, N.R., S.S.A., and K.S.; investigation, N.R., S.S.A., and K.S.; writing—original draft preparation, N.R., S.S.A., and K.S.; writing—review and editing, N.R., S.S.A., and K.S.; supervision, K.S.

Funding

This research received no external funding.

Acknowledgments

We would like to thank all authors and publishers who shared images of their tools with us.

Conflicts of Interest

The authors declare that there is no conflict of interest.

References

1. Tang, P.C.; McDonald, C.J. Electronic health record systems. In Biomedical Informatics: Computer Applications in Health Care and Biomedicine; Health Informatics; Shortliffe, E.H., Cimino, J.J., Eds.; Springer: New York, NY, USA, 2006; pp. 447–475. ISBN 978-0-387-36278-6.
2. Christensen, T.; Grimsmo, A. Instant availability of patient records, but diminished availability of patient information: A multi-method study of GP’s use of electronic patient records. BMC Med. Inform. Decis. Mak. 2008, 8, 12.
3. Boonstra, A.; Versluis, A.; Vos, J.F. Implementing electronic health records in hospitals: A systematic literature review. BMC Health Serv. Res. 2014, 14, 370.
4. Himmelstein, D.U.; Wright, A.; Woolhandler, S. Hospital computing and the costs and quality of care: A national study. Am. J. Med. 2010, 123, 40–46.
5. Rind, A.; Wang, T.D.; Aigner, W.; Miksch, S.; Wongsuphasawat, K.; Plaisant, C.; Shneiderman, B. Interactive information visualization to explore and query electronic health records. HCI 2013, 5, 207–298.
6. Sears, A.; Jacko, J.A. The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications, 2nd ed.; CRC Press: Boca Raton, FL, USA, 2007; ISBN 978-1-4106-1586-2.
7. Sedig, K.; Parsons, P. Design of visualizations for human-information interaction: A pattern-based framework. Synth. Lect. Vis. 2016, 4, 1–185.
8. Lesselroth, B.J.; Pieczkiewicz, D.S. Data Visualization Strategies for the Electronic Health Record; Nova Science Publishers, Inc.: Hauppauge, NY, USA, 2011; ISBN 978-1-61209-270-6.
9. Combi, C.; Keravnou-Papailiou, E.; Shahar, Y. Temporal Information Systems in Medicine; Springer Science & Business Media: Berlin, Germany, 2010; ISBN 978-1-4419-6543-1.
10. Aigner, W.; Kaiser, K.; Miksch, S. Visualization techniques to support authoring, execution, and maintenance of clinical guidelines. In Computer-Based Medical Guidelines and Protocols: A Primer and Current Trends; IOS Press: Amsterdam, The Netherlands, 2008; Volume 139, pp. 140–159.
11. Ola, O.; Sedig, K. Discourse with visual health data: Design of human-data interaction. Multimodal Technol. Interact. 2018, 2, 10.
12. Sedig, K.; Parsons, P. Interaction design for complex cognitive activities with visual representations: A pattern-based approach. AIS Trans. Hum. Comput. Interact. 2013, 5, 84–133.
13. Wang, T.D.; Plaisant, C.; Quinn, A.J.; Stanchak, R.; Murphy, S.; Shneiderman, B. Aligning temporal data by sentinel events: Discovering patterns in electronic health records. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, New York, NY, USA, 5–10 April 2008; pp. 457–466.
14. Wongsuphasawat, K.; Guerra Gómez, J.A.; Plaisant, C.; Wang, T.D.; Taieb-Maimon, M.; Shneiderman, B. LifeFlow: Visualizing an overview of event sequences. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, New York, NY, USA, 7–12 May 2011; pp. 1747–1756.
15. Wongsuphasawat, K.; Gotz, D. Exploring flow, factors, and outcomes of temporal event sequences with the Outflow visualization. IEEE Trans. Vis. Comput. Graph. 2012, 18, 2659–2668.
16. Malik, S.; Du, F.; Monroe, M.; Onukwugha, E.; Plaisant, C.; Shneiderman, B. An evaluation of visual analytics approaches to comparing cohorts of event sequences. In Proceedings of the EHRVis Workshop on Visualizing Electronic Health Record Data at VIS, Paris, France, 9 November 2014; Volume 14.
17. Fails, J.A.; Karlson, A.; Shahamat, L.; Shneiderman, B. A visual interface for multivariate temporal data: Finding patterns of events across multiple histories. In Proceedings of the 2006 IEEE Symposium on Visual Analytics Science and Technology, IEEE, Baltimore, MD, USA, 31 October–2 November 2006; pp. 167–174.
18. Klimov, D.; Shahar, Y.; Taieb-Maimon, M. Intelligent selection and retrieval of multiple time-oriented records. J. Intell. Inf. Syst. 2010, 35, 261–300.
19. Wongsuphasawat, K. Finding comparable patient histories: A temporal categorical similarity measure with an interactive visualization. In Proceedings of the IEEE Symposium on Visual Analytics Science and Technology (VAST), Atlantic City, NJ, USA, 11–16 October 2009.
20. Monroe, M.; Lan, R.; Lee, H.; Plaisant, C.; Shneiderman, B. Temporal event sequence simplification. IEEE Trans. Vis. Comput. Graph. 2013, 19, 2227–2236.
21. Brodbeck, D.; Gasser, R.; Degen, M.; Reichlin, S.; Luthiger, J. Enabling large-scale telemedical disease management through interactive visualization. Eur. Notes Med. Inform. 2005, 1, 1172–1177.
22. Chittaro, L.; Combi, C.; Trapasso, G. Data mining on temporal data: A visual approach and its clinical application to hemodialysis. J. Vis. Lang. Comput. 2003, 14, 591–620.
23. Rind, A.; Aigner, W.; Miksch, S.; Wiltner, S.; Pohl, M.; Drexler, F.; Neubauer, B.; Suchy, N. Visually exploring multivariate trends in patient cohorts using animated scatter plots. In Ergonomics and Health Aspects of Work with Computers; Robertson, M.M., Ed.; Springer: Berlin/Heidelberg, Germany, 2011; pp. 139–148.
24. Plaisant, C.; Mushlin, R.; Snyder, A.; Li, J.; Heller, D.; Shneiderman, B. LifeLines: Using visualization to enhance navigation and analysis of patient records. Proc. Am. Med. Inform. Assoc. Annu. Fall Symp. 1998, 76–80.
25. Faiola, A.; Newlon, C. Advancing critical care in the ICU: A human-centered biomedical data visualization systems. In Proceedings of the International Conference on Ergonomics and Health Aspects of Work with Computers; Springer: Berlin/Heidelberg, Germany, 2011; pp. 119–128.
26. Pieczkiewicz, D.S.; Finkelstein, S.M.; Hertz, M.I. Design and evaluation of a web-based interactive visualization system for lung transplant home monitoring data. AMIA Annu. Symp. Proc. 2007, 2007, 598.
27. Bade, R.; Schlechtweg, S.; Miksch, S. Connecting time-oriented data and information to a coherent interactive visualization. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, Vienna, Austria, 24–29 April 2004; pp. 105–112.
28. Hinum, K.; Miksch, S.; Aigner, W.; Ohmann, S.; Popow, C.; Pohl, M.; Rester, M. Gravi++: Interactive information visualization to explore highly structured temporal data. J. UCS 2005, 11, 1792–1805.
29. Rind, A.; Aigner, W.; Miksch, S.; Wiltner, S.; Pohl, M.; Turic, T.; Drexler, F. Visual exploration of time-oriented patient data for chronic diseases: Design study and evaluation. In Proceedings of the Symposium of the Austrian HCI and Usability Engineering Group; Springer: Berlin/Heidelberg, Germany, 2011; pp. 301–320.
30. Ordonez, P.; Oates, T.; Lombardi, M.E.; Hernandez, G.; Holmes, K.W.; Fackler, J.; Lehmann, C.U. Visualization of multivariate time-series data in a neonatal ICU. IBM J. Res. Dev. 2012, 56, 7:1–7:12.
31. Gresh, D.L.; Rabenhorst, D.A.; Shabo, A.; Slavin, S. Prima: A case study of using information visualization techniques for patient record analysis. In Proceedings of the IEEE Visualization (VIS 2002), Boston, MA, USA, 27 October–1 November 2002; pp. 509–512.
32. Horn, W.; Popow, C.; Unterasinger, L. Support for fast comprehension of ICU data: Visualization using metaphor graphics. Methods Inf. Med. 2001, 40, 421–424.
33. Låg, T.; Bauger, L.; Lindberg, M.; Friborg, O. The role of numeracy and intelligence in health-risk estimation and medical data interpretation. J. Behav. Decis. Mak. 2014, 27, 95–108.
34. Groves, M.; O’Rourke, P.; Alexander, H. Clinical reasoning: The relative contribution of identification, interpretation and hypothesis errors to misdiagnosis. Med. Teach. 2003, 25, 621–625.
35. Auffray, C.; Balling, R.; Barroso, I.; Bencze, L.; Benson, M.; Bergeron, J.; Bernal-Delgado, E.; Blomberg, N.; Bock, C.; Conesa, A.; et al. Making sense of big data in health research: Towards an EU action plan. Genome Med. 2016, 8, 71.
36. Komaroff, A.L. The variability and inaccuracy of medical data. Proc. IEEE 1979, 67, 1196–1207.
37. Kumar, M.; Stoll, N.; Kaber, D.; Thurow, K.; Stoll, R. Fuzzy filtering for an intelligent interpretation of medical data. In Proceedings of the 2007 IEEE International Conference on Automation Science and Engineering, Scottsdale, AZ, USA, 22–25 September 2007; pp. 225–230.
38. Amarasingham, R.; Patzer, R.E.; Huesch, M.; Nguyen, N.Q.; Xie, B. Implementing electronic health care predictive analytics: Considerations and challenges. Health Aff. 2014, 33, 1148–1154.
39. Cohen, I.G.; Amarasingham, R.; Shah, A.; Xie, B.; Lo, B. The legal and ethical concerns that arise from using complex predictive analytics in health care. Health Aff. 2014, 33, 1139–1147.
40. Kankanhalli, A.; Hahn, J.; Tan, S.; Gao, G. Big data and analytics in healthcare: Introduction to the special section. Inf. Syst. Front. 2016, 18, 233–235.
41. Wang, Y.; Kung, L.; Byrd, T.A. Big data analytics: Understanding its capabilities and potential benefits for healthcare organizations. Technol. Forecast. Soc. Chang. 2018, 126, 3–13.
42. Simpao, A.F.; Ahumada, L.M.; Gálvez, J.A.; Rehman, M.A. A review of analytics and clinical informatics in health care. J. Med. Syst. 2014, 38, 45.
43. Raghupathi, W.; Raghupathi, V. Big data analytics in healthcare: Promise and potential. Health Inf. Sci. Syst. 2014, 2, 3.
44. Saeed, M.; Lieu, C.; Raber, G.; Mark, R.G. MIMIC II: A massive temporal ICU patient database to support research in intelligent patient monitoring. Proc. Comput. Cardiol. 2002, 29, 641–644.
45. Tia, G.; Greenspan, D.; Welsh, M.; Juang, R.R.; Alm, A. Vital signs monitoring and patient tracking over a wireless network. In Proceedings of the 2005 IEEE Engineering in Medicine and Biology, 27th Annual Conference, Shanghai, China, 31 August–3 September 2005; pp. 102–105.
46. Hauskrecht, M.; Batal, I.; Valko, M.; Visweswaran, S.; Cooper, G.F.; Clermont, G. Outlier detection for patient monitoring and alerting. J. Biomed. Inform. 2013, 46, 47–55.
47. Anderson, H.D.; Pace, W.D.; Brandt, E.; Nielsen, R.D.; Allen, R.R.; Libby, A.M.; West, D.R.; Valuck, R.J. Monitoring suicidal patients in primary care using electronic health records. J. Am. Board Fam. Med. 2015, 28, 65–71.
48. Kho, A.; Rotz, D.; Alrahi, K.; Cárdenas, W.; Ramsey, K.; Liebovitz, D.; Noskin, G.; Watts, C. Utility of commonly captured data from an EHR to identify hospitalized patients at risk for clinical deterioration. AMIA Annu. Symp. Proc. 2007, 2007, 404–408.
49. Li, X.; Wang, Y. Adaptive online monitoring for ICU patients by combining just-in-time learning and principal component analysis. J. Clin. Monit. Comput. 2016, 30, 807–820.
50. Siegel, E. Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die; John Wiley & Sons: Hoboken, NJ, USA, 2013; ISBN 978-1-118-35685-2.
51. Glasziou, P.; Irwig, L.; Mant, D. Monitoring in chronic disease: A rational approach. BMJ 2005, 330, 644–648.
52. Wang, T.D.; Plaisant, C.; Shneiderman, B.; Spring, N.; Roseman, D.; Marchand, G.; Mukherjee, V.; Smith, M. Temporal summaries: Supporting temporal categorical searching, aggregation and comparison. IEEE Trans. Vis. Comput. Graph. 2009, 15, 1049–1056.
53. Guerra Gómez, J.; Wongsuphasawat, K.; Wang, T.D.; Pack, M.; Plaisant, C. Analyzing incident management event sequences with interactive visualization. In Proceedings of the Transportation Research Board 90th Annual Meeting, Compendium of Papers, Washington, DC, USA, 23–27 January 2011.
54. Malik, S.; Du, F.; Monroe, M.; Onukwugha, E.; Plaisant, C.; Shneiderman, B. Cohort comparison of event sequences with balanced integration of visual analytics and statistics. In Proceedings of the 20th International Conference on Intelligent User Interfaces; ACM: New York, NY, USA, 2015; pp. 38–49.
55. Wongsuphasawat, K.; Gotz, D. Outflow: Visualizing patient flow by symptoms and outcome. In Proceedings of the IEEE VisWeek Workshop on Visual Analytics in Healthcare, Providence, RI, USA, 23 October 2011; American Medical Informatics Association: Bethesda, MD, USA, 2011; pp. 25–28.
56. Klimov, D.; Shahar, Y.; Taieb-Maimon, M. Intelligent visualization and exploration of time-oriented data of multiple patients. Artif. Intell. Med. 2010, 49, 11–31.
57. Pohl, M.; Wiltner, S.; Rind, A.; Aigner, W.; Miksch, S.; Turic, T.; Drexler, F. Patient development at a glance: An evaluation of a medical data visualization. In Human-Computer Interaction—INTERACT 2011; Campos, P., Graham, N., Jorge, J., Nunes, N., Palanque, P., Winckler, M., Eds.; Springer: Berlin/Heidelberg, Germany, 2011; Volume 6949, pp. 292–299. ISBN 978-3-642-23767-6.
58. EHR Intelligence. 40% of Physicians See More EHR Challenges than Benefits. Available online: https://ehrintelligence.com/news/40-of-physicians-see-more-ehr-challenges-than-benefits (accessed on 18 December 2019).
Figure 1. Relationships among activities, tasks, and interactions. Top-down view: activity is made up of sub-activities, tasks, sub-tasks, and interactions. Bottom-up view: activity emerges over time, through performance of tasks and interactions. Visualizations are depicted as Vis and reactions as Rx. Source: adapted from [7].
Figure 2. Overview of the proposed activity and task analysis framework. The visual tasks are represented as blue and interactive tasks are represented as yellow.
Figure 3. Search results and how we selected the 24 articles that described 19 IVTs.
Figure 4. Lifelines2: Interactive visualization tool for temporal categorical data. Source: Image courtesy of the University of Maryland Human–Computer Interaction Lab, http://hcil.umd.edu.
Figure 5. Lifeflow: Interactive visualization tool that provides an overview of event sequences. Source: Image courtesy of the University of Maryland Human–Computer Interaction Lab, http://hcil.umd.edu.
Figure 6. Eventflow: Interactive visualization tool for analysis of event sequences for both point-based and interval events. Source: Image courtesy of the University of Maryland Human–Computer Interaction Lab, http://hcil.umd.edu.
Figure 7. Caregiver: Interactive visualization tool for visualization of categorical and numerical data. Source: Image courtesy of Dominique Brodbeck.
Figure 8. CoCo: Interactive visualization tool for comparing cohorts of event sequences. Source: Image courtesy of the University of Maryland Human–Computer Interaction Lab, http://hcil.umd.edu.
Figure 9. Similan: Interactive visualization tool for the exploration of similar records in the temporal categorical data. Source: Image courtesy of the University of Maryland Human–Computer Interaction Lab, http://hcil.umd.edu.
Figure 10. IPBC: 3D visualization tool for analysis of numerical data from multiple hemodialysis sessions. Source: Reprinted from Journal of Visual Languages & Computing, 14, Chittaro L, Combi C, Trapasso G, Data mining on temporal data: a visual approach and its clinical application to hemodialysis, 591–620, Copyright (2003), with permission from Elsevier.
Figure 11. TimeRider: Interactive visualization tool for pattern recognition in patient cohort data. Source: Reprinted by permission from Springer Nature: Springer, Ergonomics and Health Aspects of Work with Computers, Visually Exploring Multivariate Trends in Patient Cohorts Using Animated Scatter Plots, Rind A, Aigner W, Miksch S, et al., Copyright (2011).
Figure 12. VISITORS: Interactive visualization tool for the exploration of multiple patient records. (A) displays lists of patients. (B) displays a list of time intervals. (C) displays the data for a group of 58 patients over the current time interval. Panel 1 shows the white blood cell raw counts for the patients, while Panels 2 and 3 display the states of monthly distribution of platelet and haemoglobin in higher abstraction, respectively. Abstractions are encoded in medical ontologies displayed in panels (D). Source: Reprinted from Journal of Artificial Intelligence in Medicine, 49, Klimov D, Shahar Y, Taieb-Maimon M, Intelligent visualization and exploration of time-oriented data of multiple patients, 11–31, Copyright (2010), with permission from Elsevier.
Figure 13. MIVA: Interactive visualization tool to show the temporal change of numerical values where each variable is represented by an individual point plot. Source: Image courtesy of Antony Faiola.
Figure 14. Lifelines: Interactive visualization tool that displays patient’s medical histories on a timeline. Source: Image courtesy of the University of Maryland Human–Computer Interaction Lab, http://hcil.umd.edu.
Figure 15. VisuExplore: Interactive visualization tool that displays patient data in various views on a timeline. Source: Reprinted by permission from Springer Nature: Springer, Human–Computer Interaction, Patient Development at a Glance: An Evaluation of a Medical Data Visualization, Pohl M, Wiltner S, Rind A, et al., Copyright (2011).
Table 1. Breakdown of the interactive and visual tasks (Task | Sub-tasks).
Interactive tasks
  Ordering | Aggregating, Classifying, Identifying, Ranking
  Locating | Aggregating, Aligning, Classifying, Identifying, Ranking
  Querying | Classifying, Identifying, Ranking
  Organizing | Aggregating, Classifying, Identifying, Highlighting
  Summarizing | Aggregating, Classifying, Identifying
  Clustering | Classifying, Identifying, Ranking
  Observing | Aggregating, Aligning, Identifying, Ranking
Visual tasks
  Recognizing | Aggregating, Aligning, Classifying, Identifying, Ranking
  Specifying | Aggregating, Aligning, Classifying, Identifying, Highlighting, Ranking
  Detecting | Classifying, Identifying, Ranking
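As a companion to Table 1, the following minimal sketch (ours; the dictionary name, helper function, and example call are hypothetical) encodes the task-to-sub-task mapping as a plain lookup table, so that the union of sub-tasks required by a chosen set of tasks can be computed directly.

# A minimal sketch (ours): Table 1's task-to-sub-task mapping as a lookup table.
SUB_TASKS_BY_TASK = {
    # interactive tasks
    "ordering":    ["aggregating", "classifying", "identifying", "ranking"],
    "locating":    ["aggregating", "aligning", "classifying", "identifying", "ranking"],
    "querying":    ["classifying", "identifying", "ranking"],
    "organizing":  ["aggregating", "classifying", "identifying", "highlighting"],
    "summarizing": ["aggregating", "classifying", "identifying"],
    "clustering":  ["classifying", "identifying", "ranking"],
    "observing":   ["aggregating", "aligning", "identifying", "ranking"],
    # visual tasks
    "recognizing": ["aggregating", "aligning", "classifying", "identifying", "ranking"],
    "specifying":  ["aggregating", "aligning", "classifying", "identifying", "highlighting", "ranking"],
    "detecting":   ["classifying", "identifying", "ranking"],
}

def required_sub_tasks(tasks):
    """Union of the sub-tasks needed to support the given tasks, per Table 1."""
    return sorted({s for t in tasks for s in SUB_TASKS_BY_TASK[t]})

# Example: a tool that supports the ordering and recognizing tasks.
print(required_sub_tasks(["ordering", "recognizing"]))
# ['aggregating', 'aligning', 'classifying', 'identifying', 'ranking']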
Table 2. Overview of the search terms used.
Terms Used
“Visualization*” + “Health Record*”
“Visualization*” + “Electronic Health Record*”
“Visualization*” + “EHR*”
“Visualization*” + “Electronic Patient Record*”
“Visualization*” + “Electronic Medical Record*”
“Visualization*” + “Patients Record*”
“Visualization*” + “Patient Record*”
“Visualization tool*” + “Health Record*”
“Visualization tool*” + “Electronic Health Record*”
“Visualization tool*” + “EHR*”
“Visualization tool*” + “Electronic Patient Record*”
“Visualization tool*” + “Electronic Medical Record*”
“Visualization tool*” + “Patients Record*”
“Visualization tool*” + “Patient Record*”
“Information visualization*” + “Health Record*”
“Information visualization*” + “Electronic Health Record*”
“Information visualization*” + “EHR*”
“Information visualization*” + “Electronic Patient Record*”
“Information visualization*” + “Electronic Medical Record*”
“Information visualization*” + “Patients Record*”
“Information visualization*” + “Patient Record*”
“Interactive visualization*” + “Health Record*”
“Interactive visualization*” + “Electronic Health Record*”
“Interactive visualization*” + “EHR*”
“Interactive visualization*” + “Electronic Patient Record*”
“Interactive visualization*” + “Electronic Medical Record*”
“Interactive visualization*” + “Patients Record*”
“Interactive visualization*” + “Patient Record*”
“Interactive visualization tool*” + “Health Record*”
“Interactive visualization tool*” + “Electronic Health Record*”
“Interactive visualization tool*” + “EHR*”
“Interactive visualization tool*” + “Electronic Patient Record*”
“Interactive visualization tool*” + “Electronic Medical Record*”
“Interactive visualization tool*” + “Patients Record*”
“Interactive visualization tool*” + “Patient Record*”
“Visualization system*” + “Health Record*”
“Visualization system*” + “Electronic Health Record*”
“Visualization system*” + “EHR*”
“Visualization system*” + “Electronic Patient Record*”
“Visualization system*” + “Electronic Medical Record*”
“Visualization system*” + “Patients Record*”
“Visualization system*” + “Patient Record*”
“Information visualization system*” + “Health Record*”
“Information visualization system*” + “Electronic Health Record*”
“Information visualization system*” + “EHR*”
“Information visualization system*” + “Electronic Patient Record*”
“Information visualization system*” + “Electronic Medical Record*”
“Information visualization system*” + “Patients Record*”
“Information visualization system*” + “Patient Record*”
Table 3. Evaluation summary of the 19 existing tools based on the proposed framework (columns: Interpreting | Predicting | Monitoring).
Population-based tools
Lifelines2
  Sub-activity: discovering, understanding | no | investigating
  Tasks: locating, observing, ordering | n/a | locating, observing, ordering
  Sub-tasks: aggregating, identifying, ranking | n/a | aggregating, identifying, ranking
Lifeflow
  Sub-activity: exploring, overviewing | no | analyzing
  Tasks: ordering, recognizing | n/a | ordering, recognizing
  Sub-tasks: aggregating, classifying, identifying | n/a | aggregating, classifying, identifying
Eventflow
  Sub-activity: exploring, overviewing | learning | investigating
  Tasks: recognizing, summarizing | specifying, summarizing | detecting
  Sub-tasks: aggregating, classifying, identifying | aggregating, classifying, identifying | aggregating, classifying, identifying
Similan
  Sub-activity: discovering, exploring | discovering | no
  Tasks: detecting, recognizing | ordering, querying | n/a
  Sub-tasks: identifying, classifying, ranking | identifying, classifying, ranking | n/a
CoCo
  Sub-activity: exploring | learning | investigating
  Tasks: detecting | detecting | detecting
  Sub-tasks: classifying, identifying, ranking | identifying, classifying, ranking | identifying, classifying, ranking
Outflow
  Sub-activity: exploring, overviewing | discovering | no
  Tasks: detecting, specifying, summarizing | detecting, specifying, summarizing | n/a
  Sub-tasks: aggregating, classifying, identifying | aggregating, classifying, identifying | n/a
Caregiver
  Sub-activity: discovering | learning | n/a
  Tasks: specifying | clustering, specifying | n/a
  Sub-tasks: classifying, identifying, ranking | classifying, identifying, ranking | n/a
Gravi++
  Sub-activity: discovering, exploring | no | investigating
  Tasks: recognizing, specifying | n/a | recognizing, specifying
  Sub-tasks: classifying, identifying | n/a | classifying, identifying
IPBC
  Sub-activity: exploring | no | evaluating
  Tasks: recognizing, specifying | n/a | recognizing, specifying
  Sub-tasks: classifying, identifying, ranking | n/a | classifying, identifying, ranking
Pattern Finder
  Sub-activity: discovering, exploring | no | no
  Tasks: specifying, querying | n/a | n/a
  Sub-tasks: identifying, ranking | n/a | n/a
Prima
  Sub-activity: exploring | no | no
  Tasks: recognizing, specifying | n/a | n/a
  Sub-tasks: aggregating, ranking | n/a | n/a
Timerider
  Sub-activity: detecting, overviewing | no | investigating
  Tasks: clustering, recognizing, specifying | n/a | recognizing
  Sub-tasks: aligning, identifying | n/a | n/a
VISITORS
  Sub-activity: exploring | no | investigating
  Tasks: locating, observing, specifying | n/a | locating, observing, recognizing, specifying
  Sub-tasks: aggregating, aligning, classifying | n/a | aggregating, aligning, classifying, identifying
WBIVS
  Sub-activity: discovering, exploring | no | investigating
  Tasks: organizing, specifying | n/a | organizing, specifying
  Sub-tasks: classifying, highlighting, identifying | n/a | classifying, highlighting, identifying
Single-patient tools
Midgard
  Sub-activity: exploring | no | no
  Tasks: recognizing | n/a | n/a
  Sub-tasks: classifying, identifying | n/a | n/a
MIVA
  Sub-activity: exploring | no | no
  Tasks: recognizing, specifying | n/a | n/a
  Sub-tasks: classifying, identifying | n/a | n/a
VIE-Visu
  Sub-activity: overviewing | no | evaluating
  Tasks: recognizing | n/a | specifying
  Sub-tasks: aggregating, classifying | n/a | aggregating, classifying
Lifelines
  Sub-activity: understanding | no | investigating
  Tasks: recognizing, specifying | n/a | outlining, summarizing
  Sub-tasks: aggregating, classifying, identifying | n/a | aggregating, classifying, identifying
VisuExplore
  Sub-activity: exploring | no | evaluating
  Tasks: specifying | n/a | recognizing
  Sub-tasks: aligning, identifying | n/a | identifying

