Assessment of Eye Fatigue Caused by 3D Displays Based on Multimodal Measurements
Figure 1. Experimental procedures used in our research.
Figure 2. Proposed system for the assessment of eye fatigue. (a) Proposed experimental device; (b) Example of experimental environment.
Figure 3. International 10–20 electrode placement system.
Figure 4. Example of sub-block-based template matching algorithm.
Figure 5. Example of measurement of eye blinking. (a) Open eyes; (b) Closed eyes.
Figure 6. Example of detection of face and nose. (a) The detected regions of the face and nose in the web-camera image; (b) The defined regions of the face and nose in the thermal camera image.
Figure 7. Example of the measurement region for variation of FT.
Figure 8. Experimental procedure.
Figure 9. Comparison of SE scores before and after watching the 3D display.
Abstract
With the development of 3D displays, users' eye fatigue has become an important issue when viewing these displays. Previous studies on eye fatigue related to 3D display use have mostly employed a limited number of modalities for measurement, such as electroencephalograms (EEGs), biomedical signals, and eye responses. In this paper, we propose a new assessment of eye fatigue related to 3D display use based on multimodal measurements. Compared to previous works, our research is novel in the following four ways: first, to enhance the accuracy of the assessment of eye fatigue, we measure EEG signals, eye blinking rate (BR), facial temperature (FT), and a subjective evaluation (SE) score before and after a user watches a 3D display; second, in order to accurately measure BR in a manner that is convenient for the user, we implement a remote gaze-tracking system using a high-speed (mega-pixel) camera that measures eye blinks of both eyes; third, changes in the FT are measured using a remote thermal camera, which can enhance the measurement of eye fatigue; and fourth, we perform various statistical analyses to evaluate the correlation between the EEG signal, eye BR, FT, and the SE score based on the T-test, correlation matrix, and effect size. Results show that the correlation of the SE with the other data (FT, BR, and EEG) is the highest, while those of the FT, BR, and EEG with the other data are the second, third, and fourth highest, respectively.
1. Introduction
The recent rapid development of 3D displays has resulted in various types of 3D devices, such as anaglyph, passive-glasses, active-glasses, and glasses-free types. However, with the increasing prevalence of 3D displays, user eye fatigue has become an important health issue when viewing a 3D display. Eye fatigue, which manifests as tiredness, eyestrain, dizziness, and related symptoms, is usually induced by the discordance between accommodation and convergence when viewing 3D displays [1]. Many previous studies have therefore focused on measuring eye fatigue; these are classified into two categories: single modality-based and multiple modality-based methods.
Single modality-based methods measure eye fatigue using a single modality, such as image features captured by a camera [2–5] or bio-signals alone [6–12]. In previous studies, eye blinks observed in images captured by a camera were used to measure eye fatigue [2,4,5]. Other studies showed that visual fatigue could be measured by analyzing eye movements and eye blinks [3]. In one study [5], the authors quantitatively measured the eyestrain caused by a 3D display based on the eye blinking rate (BR); in this case, they calculated the levels of various factors considering an eye foveation model and edge information.
In general, methods that use image features obtained by a camera have the advantage of causing less discomfort to the participants during experiments than methods based on bio-signals, which require the attachment of cumbersome sensors to the body. In addition, bio-signals may contain noise caused by muscle movements. However, the speed with which conventional cameras (such as web-cams) can acquire images is usually lower than the acquisition speed of bio-signals because of the larger volume of data to be transmitted from the camera to the computer.
Bio-signal-based studies have measured eye fatigue by analyzing eye movements using electrooculography (EOG) signals [6]. Other studies showed that electroencephalograms (EEGs) based on an event-related potential (ERP) may be used to measure visual fatigue [7,9]. Chen et al. showed that the alpha, beta, and delta bands of EEG signals can be used to measure visual fatigue for 3D TV [8]. In another study, visual fatigue was evaluated by analyzing the user's electrocardiography (ECG) signals [10]. Qian et al. showed that the blink signals extracted from EEG data may indicate eye fatigue [11]. Mun et al. proposed a method for identifying the steady-state visually evoked potential (SSVEP) and ERP, which are linked to 3D cognitive fatigue for stereoscopic 3D displays, using EEG signals [12]. Park et al. also measured eye fatigue when viewing 3D displays using ECGs and a subjective evaluation (SE) [13].
With respect to multiple modality-based methods, eye fatigue has been measured using multiple sensors simultaneously [14,15]. Kim et al. proposed a method for measuring eye fatigue on a 3D display using ECG sensors, the galvanic skin response (GSR), and the skin temperature (SKT) together with SE [14]. Eye fatigue when viewing 3D displays has also been measured using the power of the beta band of EEGs and the BR within a Bayesian network [15]. These methods are usually more accurate than single modality-based methods. However, the attachment of multiple sensors may cause considerable discomfort to the user, which can lead to incorrect eye fatigue measurements [14]. In addition, the accuracy gain achievable with only two modalities (EEG and BR) is limited [15].
To overcome these problems of previous works, we propose a method for assessing eye fatigue caused by 3D displays based on multimodal measurements, including EEG signals, eye BR, and facial temperature (FT), together with SE both before and after watching a 3D display. To accurately measure the BR in a manner that is convenient for the user, we implement a remote gaze-tracking system using a high-speed mega-pixel camera that can measure the BR of both eyes. Changes in the FT are measured using a remote thermal camera, which can enhance the measurement of eye fatigue. Table 1 summarizes the comparison of previous and proposed methods for measuring eye fatigue.
The remainder of this paper is structured as follows: In Section 2, we describe the proposed system and the feature analysis. Then, the experimental setup and results are described in Section 3. Finally, we present the conclusions in Section 4.
2. Proposed Method for Measuring Eye Fatigue
2.1. Experimental Procedure for Proposed System and Method
Figure 1 shows the experimental procedures used in our research. First, an SE was carried out in order to check the subject's condition before watching the 3D display. In order to measure the subjects' natural blinking, the eye BR was measured for 1 min before the subject watched the 3D display. Also, prior to watching the 3D display, the user's EEG data and FT were measured for 5 min with eyes closed in order to minimize visual stimuli. The subjects then watched the 3D display for 30 min. The eye BR was measured during the last 1 min of watching in order to compare the variations in the eye BR immediately before and after watching the 3D display. After watching the display, the EEG data and FT were again measured for 5 min while the user's eyes were closed in order to minimize visual stimuli, and these measurements were used to compare the variations in the EEG and FT data. Finally, an SE was carried out to check the subject's condition after watching the 3D display.
As shown in Figure 2a, a user wears both a headset-based EEG device and active shutter glasses [16] for the experiment. As shown in Figure 2b, the user wearing shutter glasses and the EEG device watches 3D content shown on a 60-inch smart TV with a display resolution of 1920 × 1080 pixels. In order to assess eye fatigue, we measured the EEG signals, eye BR, and FT using the EEG device, high speed camera [17], and thermal camera [18], respectively.
We used a commercial low-cost EEG device (Emotiv EPOC headset) that consists of 16 electrodes, including two reference nodes, for measuring EEG signals [19], as shown in Figure 2. The placement of the electrodes complies with the international 10–20 system, as in previous studies [20,21] and as shown in Figure 3. The EEG data were processed using a built-in digital 5th-order Sinc filter. The sampling rate of the Emotiv EPOC system was 128 Hz (128 samples/s) [20,22].
A high-speed camera of 4 mega-pixels was used to measure the eye BR and to acquire images of both eyes. The camera acquires images with a resolution of 2048 × 2048 pixels at a speed of 150 fps [17]. However, the actual acquisition speed is much lower than 150 fps because of the large amount of data to be transmitted from the camera to the computer and then written to the hard disk. In order to improve the image acquisition speed, a cropped image of 2048 × 512 pixels (including both eyes) is captured. To enhance the accuracy of the detection of the pupil region, near infrared (NIR) illumination at 850 nm was used, as shown in Figure 2b, which does not dazzle the user's eyes [23].
As shown in Figure 2, a thermal camera was used to measure the FT, which changes with eye fatigue. The captured thermal image has a resolution of 320 × 240 pixels with 14 bits, and its capturing speed ranges from 50 to 60 fps. However, the actual capture speed is much lower than 50 fps because of the large amount of data to be transmitted from the camera to the computer and then written to the hard disk. The measurable temperature of the thermal camera ranges from −20 °C to 100 °C, and its accuracy is ±1 °C or ±1% [18]. As shown in Figure 2b, a commercial web-camera (Webcam C600) [24] was attached close to the thermal camera in order to locate the user's face and nose. Because it is difficult to find the accurate positions of the face and nose in the thermal image, we used their positions in the web-camera image to define the corresponding areas in the thermal image. The web-camera captures an 800 × 600 pixel 24-bit image at a speed of 30 fps using the NIR illuminator. The NIR illuminator is used to detect the face and nose in a manner that is robust to illumination variations.
As shown in Figure 2b, the distance between the subject and the 3D TV is set at about 250 cm considering the guidelines for watching 3D TV [25]. The distance between the subject and the high-speed camera is about 80 cm, while the distance between the subject and the thermal camera is about 100 cm, and the distance between the subject and the NIR illuminator is about 60 cm.
2.2. Analysis of EEG Data for Measuring Eye Fatigue
To measure eye fatigue, we analyzed the beta band (13–30 Hz) of the EEG data, because the power of EEG signals in the beta band is usually stronger when viewing 3D displays [7,10]. The EEG signals are determined based on the voltage levels measured at each electrode. The measured EEG signals are normalized by adjusting the DC level, and a further min-max normalization is performed to represent the EEG magnitude within the range of −1 to 1. Then, the power of the beta band is calculated by performing a Fourier transform with a window length of 128 samples.
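To make this step concrete, the following is a minimal sketch (not the authors' original code) of the beta-band power computation for one 128-sample window using OpenCV's cv::dft. With 128 samples at 128 Hz, each DFT bin spans 1 Hz, so the beta band corresponds to bins 13–30; the exact scaling used in the paper is not specified, so the normalization here is an assumption that follows the description above.

```cpp
#include <opencv2/core.hpp>
#include <algorithm>
#include <vector>

// Minimal sketch (assumed details, not the authors' implementation):
// beta-band (13-30 Hz) power of one 128-sample EEG window sampled at 128 Hz,
// after DC-level adjustment and min-max normalization to [-1, 1].
double betaBandPower(const std::vector<double>& window)
{
    const int N = 128;                                  // window length (samples)
    CV_Assert(static_cast<int>(window.size()) == N);

    // Remove the DC level (mean) of the window.
    double mean = 0.0;
    for (double v : window) mean += v;
    mean /= N;

    std::vector<double> centered(N);
    for (int i = 0; i < N; ++i) centered[i] = window[i] - mean;

    // Min-max normalization so the magnitude lies within [-1, 1].
    double minV = *std::min_element(centered.begin(), centered.end());
    double maxV = *std::max_element(centered.begin(), centered.end());
    double range = std::max(maxV - minV, 1e-12);

    cv::Mat x(1, N, CV_64F);
    for (int i = 0; i < N; ++i)
        x.at<double>(0, i) = 2.0 * (centered[i] - minV) / range - 1.0;

    // 1-D Fourier transform; bin k corresponds to k Hz (fs = 128 Hz, N = 128).
    cv::Mat spectrum;
    cv::dft(x, spectrum, cv::DFT_COMPLEX_OUTPUT);

    double power = 0.0;
    for (int k = 13; k <= 30; ++k) {                    // beta band: 13-30 Hz
        cv::Vec2d c = spectrum.at<cv::Vec2d>(0, k);
        power += c[0] * c[0] + c[1] * c[1];             // |X(k)|^2
    }
    return power;
}
```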
2.3. Analysis of BR for Measuring Eye Fatigue
Eye BR has previously been used in the measurement of eye fatigue; higher eye fatigue is related to a higher eye BR [2,3,26]. Therefore, in our research, the eye BR was measured using a high-speed camera. In order to accurately measure the eye BR in a manner that is convenient for the user, we implemented a remote gaze-tracking system using a high-speed mega-pixel camera that can measure the eye blinks of both eyes.
With the captured image, the region of the corneal specular reflection is located using image binarization. The two eye regions are then detected around the corneal specular reflection using sub-block-based template matching, which is based on integral images in order to reduce the computational complexity [27]. The mean of each sub-block (R0 ∼ R8) is calculated as shown in Figure 4, where R0 represents the candidate pupil region and R1 ∼ R8 represent its neighboring regions. The mean gray value of the pupil region is usually lower than those of the other regions. Therefore, the mean gray value of R0 in the candidate pupil region is compared with the mean gray values of R1 ∼ R8 in the neighboring regions, as shown in Figure 4, and the sum of the differences between the mean value of R0 and those of the neighboring regions (R1 ∼ R8) is calculated. In order to find the pupil position for which this sum is a maximum, the 3 × 3 mask in Figure 4 is moved with a one-pixel step so that successive positions overlap. Within the area detected by the sub-block-based template matching, the boundary of the pupil is located using the ellipse fitting method, as shown in Figure 5a. For closed eyes, as shown in Figure 5b, the ellipse fitting method fails, whereas for open eyes it succeeds. We are therefore able to differentiate between images with open and closed eyes, and the eye BR is counted based on the number of images with closed eyes during a one-minute interval.
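A condensed sketch of this sub-block search is shown below; it is illustrative rather than the authors' exact code, and the sub-block size (20 pixels here) is an assumed parameter. The integral image makes each sub-block mean a constant-time computation, which is what keeps the exhaustive one-pixel-step search tractable.

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <limits>

// Mean gray value of a rectangle, computed in O(1) from the integral image
// produced by cv::integral() (size (rows+1) x (cols+1), CV_64F).
static double rectMean(const cv::Mat& integralImg, const cv::Rect& r)
{
    double sum = integralImg.at<double>(r.y, r.x)
               + integralImg.at<double>(r.y + r.height, r.x + r.width)
               - integralImg.at<double>(r.y, r.x + r.width)
               - integralImg.at<double>(r.y + r.height, r.x);
    return sum / (r.width * r.height);
}

// Illustrative sketch of the sub-block-based template matching (blockSize is
// an assumed parameter): slide a 3x3 grid of sub-blocks over the eye region
// with a one-pixel step and keep the position where the dark center block R0
// differs most, in total, from its neighbors R1..R8.
cv::Point findPupilCandidate(const cv::Mat& eyeGray, int blockSize = 20)
{
    cv::Mat integralImg;
    cv::integral(eyeGray, integralImg, CV_64F);

    cv::Point best(-1, -1);
    double bestScore = -std::numeric_limits<double>::max();

    for (int y = 0; y + 3 * blockSize <= eyeGray.rows; ++y) {
        for (int x = 0; x + 3 * blockSize <= eyeGray.cols; ++x) {
            cv::Rect r0(x + blockSize, y + blockSize, blockSize, blockSize);
            double m0 = rectMean(integralImg, r0);       // candidate pupil block R0

            double score = 0.0;                          // sum of (neighbor mean - R0 mean)
            for (int dy = 0; dy < 3; ++dy)
                for (int dx = 0; dx < 3; ++dx) {
                    if (dx == 1 && dy == 1) continue;    // skip R0 itself
                    cv::Rect rn(x + dx * blockSize, y + dy * blockSize,
                                blockSize, blockSize);
                    score += rectMean(integralImg, rn) - m0;
                }

            if (score > bestScore) {
                bestScore = score;
                best = cv::Point(r0.x + blockSize / 2,   // center of the winning R0
                                 r0.y + blockSize / 2);
            }
        }
    }
    return best;
}
```

The ellipse fitting that follows (for instance, with cv::fitEllipse on the extracted pupil boundary points) then succeeds only for open eyes, which is what allows closed-eye frames to be counted.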
2.4. Analysis of Variation of FT for Measuring Eye Fatigue
Changes in the FT were measured using a remote thermal camera, which can enhance the measurement of eye fatigue. A web-camera [24] is attached beside the thermal camera because it is difficult to detect facial features in the thermal image. Therefore, based on the detected areas of the face and nose in the web-camera image, the corresponding areas in the thermal image are defined considering the image disparity between the web-camera and the thermal camera. The degree of image disparity is obtained in advance by camera calibration.
The web-camera captures the image under NIR illumination. The reason for using the NIR illuminator is to detect the face and nose in a manner that is robust to variations in illumination. As shown in Figure 6a, the face region is detected by the adaptive boosting (Adaboost) method [28], and the center of both nostrils is located within the detected face region using binarization. The regions of the face and the center of the nose are then defined in the thermal camera image based on their locations in the web-camera image, as shown in Figure 6b.
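The sketch below illustrates this mapping under stated assumptions: the face is found with OpenCV's Adaboost-based cascade detector, and the region is transferred into the 320 × 240 thermal image with a fixed scale plus a disparity offset. The cascade file and the offset values are placeholders, since the paper obtains the actual disparity by camera calibration.

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/objdetect.hpp>
#include <vector>

// Illustrative sketch (placeholder calibration values, not from the paper):
// detect the face with an Adaboost/Haar cascade in the 800x600 web-camera
// image, then define the corresponding region in the 320x240 thermal image
// using a fixed scale and a disparity offset obtained in advance by calibration.
cv::Rect faceRegionInThermal(const cv::Mat& webcamGray,
                             cv::CascadeClassifier& faceCascade)
{
    const float scaleX = 320.0f / 800.0f;              // web-camera -> thermal width
    const float scaleY = 240.0f / 600.0f;              // web-camera -> thermal height
    const cv::Point2f disparity(5.0f, -3.0f);          // hypothetical calibration result

    cv::Mat eq;
    cv::equalizeHist(webcamGray, eq);                  // robustness to illumination

    std::vector<cv::Rect> faces;
    faceCascade.detectMultiScale(eq, faces, 1.1, 3, 0, cv::Size(80, 80));
    if (faces.empty()) return cv::Rect();              // no face detected

    const cv::Rect& f = faces[0];                      // take the first detection
    return cv::Rect(cvRound(f.x * scaleX + disparity.x),
                    cvRound(f.y * scaleY + disparity.y),
                    cvRound(f.width  * scaleX),
                    cvRound(f.height * scaleY));
}
```

In practice, faceCascade would be loaded from one of the cascade XML files distributed with OpenCV (for example, haarcascade_frontalface_alt.xml), and the nostril centers would then be located inside the returned region by binarization, as described above.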
Based on the detected center of the nose, variations in the temperature of the cheek regions (30 × 30 pixels) are analyzed for the measurement of eye fatigue, as shown in Figure 7.
3. Experimental Setup and Results
The data acquisition for the experiments was done using two desktop computers and a laptop computer. All of the EEG signal, eye image, and thermal image data were acquired simultaneously. The desktop computer that was used to acquire images of both eyes with a high-speed camera was equipped with a 3.07 GHz CPU (Intel (R) Core (TM) i7 CPU 950) and 6 GB RAM. The desktop computer used to acquire the EEG signals using the Emotiv EPOC headset was equipped with a 2.33 GHz CPU (Intel (R) Core (TM) 2 Quad CPU Q8200) and 4 GB RAM. The laptop computer used to acquire the web-camera and thermal camera images was equipped with a 2.50 GHz CPU (Intel (R) Core (TM) i5-2520M) and 4 GB RAM.
The proposed algorithm for the measurement of eye fatigue was implemented as a C++ program using the Microsoft Foundation Class (MFC) and OpenCV libraries. A total of 15 subjects participated in the experiments (average age: 26.89 years, standard deviation: 1.96); 12 were male and 3 were female. We obtained written informed consent from each participant. The luminance of the room was 321 lux, and the highest brightness of the display was 99.546 cd/m2. We used the sample video entitled “Summer in Heidelberg” for our experiments, which is mostly composed of landscape scenes, as shown in Figure 2b, and we obtained permission from the video copyright owner [5,29]. In order to measure the natural eye blinking rate during the last 1 min of Figure 8, no artificial sign, indication, or instruction regarding alertness was given to the participants. No participant became drowsy or dozed off during the experiments.
The experimental procedure is presented in Figure 8. In order to enhance the accuracy of the eye fatigue assessment, changes in the EEG signal, eye BR, and FT were measured together with SE both before and after watching the 3D display. In previous research [8], the variations of EEG data caused by eye fatigue on a 3D TV were measured using 16 electrodes placed over the whole head, and the EEG data were recorded while the user's eyes were closed before and after watching the 3D TV. We referred to this paper for our experimental design of measuring the EEG signal using multiple electrodes while the user's eyes are closed. To compare the subject's state before and after watching the 3D display, the SE was performed using a questionnaire. The SE before and after watching the 3D display was based on the six questions in Table 2, using a 10-point scale (1: not at all ∼ 10: yes, very much). These questions were developed based on previous studies [5,30]. The average and standard deviation of the SE scores of the 15 subjects before and after watching the 3D display are presented in Figure 9 and Table 3.
The average SE score after watching the 3D display is observed to be higher than that before watching the 3D display, as indicated by Figure 9 and Table 3. The statistical analysis was performed using an independent two-sample T-test [31], which is widely used as a statistical hypothesis test. The calculated p-value is 0.0001, which is less than the significance threshold of 0.01 (99% confidence level). The null hypothesis (that the two average SE scores before and after watching the 3D display are the same) is therefore rejected, and the two SE scores before and after watching the 3D display are significantly different [31].
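For reference, a minimal sketch of the test statistic behind these comparisons is given below, assuming the pooled-variance form of Student's independent two-sample t-test; the p-value is then obtained from the t-distribution with n1 + n2 − 2 degrees of freedom (via a table or statistics library), which is omitted here.

```cpp
#include <cmath>
#include <numeric>
#include <vector>

// Minimal sketch (pooled-variance Student's t-test assumed): t statistic for
// two independent samples a and b; degrees of freedom = a.size() + b.size() - 2.
double twoSampleT(const std::vector<double>& a, const std::vector<double>& b)
{
    auto meanOf = [](const std::vector<double>& v) {
        return std::accumulate(v.begin(), v.end(), 0.0) / v.size();
    };
    auto sampleVar = [](const std::vector<double>& v, double m) {
        double s = 0.0;
        for (double x : v) s += (x - m) * (x - m);
        return s / (v.size() - 1);                     // unbiased sample variance
    };

    const double n1 = static_cast<double>(a.size());
    const double n2 = static_cast<double>(b.size());
    const double m1 = meanOf(a), m2 = meanOf(b);
    const double v1 = sampleVar(a, m1), v2 = sampleVar(b, m2);

    // Pooled standard deviation (assumes equal population variances).
    const double sp = std::sqrt(((n1 - 1.0) * v1 + (n2 - 1.0) * v2) / (n1 + n2 - 2.0));
    return (m1 - m2) / (sp * std::sqrt(1.0 / n1 + 1.0 / n2));
}
```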
Figure 10 and Table 4 show the measured amplitudes of the beta band of each electrode of the EEG signal before and after watching the 3D display. As shown in Figure 10, the amplitude of the beta band of each electrode is stronger after watching the 3D display than before. The electrode showing the most significant difference before and after watching the 3D display was selected using the T-test, as shown in Table 4: P7, with the lowest p-value (0.0795) among all the electrodes, was selected. Although the p-value of the P7 node is the lowest at 0.0795, it does not show a significant difference at the 95% or 99% confidence level because the p-value is larger than the corresponding thresholds of 0.05 and 0.01, respectively. This is because EEG signals contain noise caused by movements of the facial muscles. Nevertheless, we used the EEG signal of the P7 node because its difference is relatively more significant than those of the other nodes, and the inter-correlations between the EEG, BR, FT, and SE measurements are reported in Tables 8 and 9.
In total, 16 electrodes are used in our EEG measurement device, as shown in Figure 3, and the strength of attachment to the head of some nodes on the back of the head (P7, P8, O1, and O2 in Figure 3) is relatively lower than that of the others owing to individual variations in head shape. Among them, the strength of attachment of the P8 node is the lowest, and the consequent movement of the P8 node on the head is larger than that of the others during the experiment, which causes the large standard deviation value in Figure 10.
Figure 11 and Table 5 show the measured eye BRs before watching the 3D display and during the last 1 min of watching the 3D display. As shown in Figure 11, the BR during the last 1 min of watching the 3D display increased compared to the value before watching the 3D display. The p-value is 0.2876, which is larger than the significance thresholds of 0.01 (99% confidence) and 0.05 (95% confidence). It is therefore difficult to say that there is a significant difference between the BR before watching the 3D display and that during the last 1 min of watching. Nevertheless, the BR increased by 21.07% (100 × (22.6 − 18.667)/18.667) during the last 1 min of watching relative to the value before watching the 3D display, as shown in Table 5.
As shown in Figure 12 and Table 6, the amplitude of the FT decreased after watching the 3D display compared to the value before watching. The p-value is 0.00089, which is much less than the significance threshold of 0.01 (99% confidence). We can therefore deduce that there is a significant difference between the FT amplitudes before and after watching the 3D display.
For the next analysis, we measured the difference between the data before and after watching the 3D display using the effect size in descriptive statistics. In statistics, the effect size is used to represent the strength of an observed phenomenon, and it is generally accepted as a descriptive statistic [5,32].
Based on previous research [33], Cohen's d values of 0.2, 0.5, and 0.8 can be defined as small, medium, and large, respectively. In Table 7 and Figure 13, Cohen's d represents the difference between two means divided by the standard deviation of the data. In terms of Cohen's d, effect sizes of 0.2 to 0.3, around 0.5, and 0.8 to infinity may be considered "small," "medium," and "large" effects, respectively [5,32]. For example, in Table 7, Cohen's d of the EEG data before and after watching the 3D display is 0.6644. Because 0.5 is commonly used as the representative value for a medium effect, and the distance (0.1356) between 0.6644 and 0.8 is smaller than the distance (0.1644) between 0.6644 and 0.5, we regard this case (0.6644) as a large effect size, as shown in Table 7. As another example, Cohen's d for the BR before and during the last 1 min of watching the 3D display is 0.3958, which is closer to 0.5 than to either 0.2 or 0.8; thus, we deduce that the difference in the BR before and during the last 1 min of watching the 3D display has a medium effect size. In the cases of the FT and SE, the effect sizes are large.
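A minimal sketch of this effect-size computation is shown below, assuming the pooled-standard-deviation form of Cohen's d; the paper only states "difference between two means divided by the standard deviation," so the exact pooling used there is an assumption.

```cpp
#include <cmath>
#include <numeric>
#include <vector>

// Minimal sketch (pooled-SD form assumed):
// Cohen's d = |mean(after) - mean(before)| / s_pooled.
// |d| near 0.2, 0.5, and 0.8 indicates small, medium, and large effects [33].
double cohensD(const std::vector<double>& before, const std::vector<double>& after)
{
    auto meanOf = [](const std::vector<double>& v) {
        return std::accumulate(v.begin(), v.end(), 0.0) / v.size();
    };
    auto sampleVar = [](const std::vector<double>& v, double m) {
        double s = 0.0;
        for (double x : v) s += (x - m) * (x - m);
        return s / (v.size() - 1);                     // unbiased sample variance
    };

    const double n1 = static_cast<double>(before.size());
    const double n2 = static_cast<double>(after.size());
    const double m1 = meanOf(before), m2 = meanOf(after);
    const double v1 = sampleVar(before, m1), v2 = sampleVar(after, m2);

    const double pooledSD =
        std::sqrt(((n1 - 1.0) * v1 + (n2 - 1.0) * v2) / (n1 + n2 - 2.0));
    return std::fabs(m2 - m1) / pooledSD;
}
```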
Based on the p-value and Cohen's d value, we find that the difference between the SEs before and after watching the 3D display is the largest. In addition, the differences in the FT, EEG, and BR are the second, third, and fourth largest, respectively.
For the next analysis, we measured the correlation between each pair of data from among the EEG, BR, FT, and SE data. To do this, we calculated the gradient, R2, and correlation value between the two sets of data, as shown in Table 8. With the data distributed in two-dimensional (2D) space, an optimal fitted line is obtained by linear regression, from which the gradient and R2 value are calculated. R2 represents the confidence level of the predicted regression line; the more reliably the regression line fits the data, the higher the R2 value [5].
The correlation value ranges from −1 to 1. Correlation values of −1 and 1 represent perfect negative and positive relationships, respectively, while a correlation value of 0 indicates that the data are uncorrelated. Because only the FT is reduced after watching the 3D display, as shown in Figure 12, whereas the SE, EEG, and BR increase, as shown in Figures 9, 10 and 11, the FT data were multiplied by −1 in order to make the directions consistent. As shown in Table 8 and Figure 14, the absolute correlation value and R2 value between the BR and SE are the highest, and those between the BR and FT are the lowest for the measurement of eye fatigue with a 3D display.
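The per-pair quantities in Table 8 can be reproduced with a short routine like the sketch below (our illustrative code, not the authors'): the least-squares gradient of the fitted line, the Pearson correlation r, and R2, which equals r squared for simple linear regression. The sign flip of the FT data is applied before calling it, as described above.

```cpp
#include <cmath>
#include <numeric>
#include <vector>

// Illustrative sketch of the pairwise analysis in Table 8: least-squares
// gradient (slope) of y on x, Pearson correlation r, and R^2 = r^2.
struct PairStats { double gradient; double r2; double correlation; };

PairStats pairStats(const std::vector<double>& x, const std::vector<double>& y)
{
    const double n = static_cast<double>(x.size());
    const double mx = std::accumulate(x.begin(), x.end(), 0.0) / n;
    const double my = std::accumulate(y.begin(), y.end(), 0.0) / n;

    double sxy = 0.0, sxx = 0.0, syy = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        sxy += (x[i] - mx) * (y[i] - my);
        sxx += (x[i] - mx) * (x[i] - mx);
        syy += (y[i] - my) * (y[i] - my);
    }
    const double slope = sxy / sxx;                  // gradient of the fitted line
    const double r = sxy / std::sqrt(sxx * syy);     // Pearson correlation
    return PairStats{slope, r * r, r};               // for simple regression, R^2 = r^2
}
```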
In order to simplify the comparison, we show the correlation matrix (based on the correlation values of Table 8) in Table 9. The correlation matrix of the four sets of measured data (EEG, BR, FT, and SE) before and after (or during the last 1 min of) watching the 3D display was calculated to enable the analysis of their statistical relationships [34].
As shown in Table 9, the relationships between the EEG and BR, EEG and FT, and EEG and SE are negative. Therefore, we find that the beta band of the EEG data shows results that are inconsistent with the other data, which is because EEG signals contain noise caused by movements of the facial muscles. The relationships between the BR and FT, BR and SE, and FT and SE are positive. As shown in Table 9, the absolute correlation value between the BR and SE is the highest, and that between the BR and FT is the lowest for the measurement of eye fatigue on the 3D display.
In order to quantitatively assess the individual correlations and the consistency of each measurement with the others, we calculated the sum of all the correlation values of each measurement, excluding the auto-correlation value of 1 (for example, the correlation between EEG and EEG). For instance, the sum for the SE is −0.2052 + 0.6653 + 0.5491 = 1.0092. As shown in Table 9 and Figure 15, the correlation of the SE with the other data is the highest based on this largest sum of 1.0092. In addition, those of the FT, BR, and EEG with the other data are the second, third, and fourth highest, respectively.
Previous studies showed that the level of eye fatigue increases after watching 3D displays [8–14], and there is no ground-truth value of the level of eye fatigue. Therefore, we measured the differences in the four measurements of EEG, FT, BR, and SE score before and after (or, in the case of the BR, during the last one minute of) watching the 3D display as the change in the level of eye fatigue, as shown in Tables 3, 4, 5 and 6 and Figures 9, 10, 11 and 12. However, because the units of each measurement are not identical, we cannot regard the absolute difference of each measurement as the difference in the level of eye fatigue. Therefore, we measured the difference between the data before and after (or during the last one minute of) watching the 3D display using the effect size in descriptive statistics, as shown in Table 7. Based on Table 7 and Figure 13, we find that the difference between the SEs before and after watching the 3D display is the largest; the differences in the FT, EEG, and BR are the second, third, and fourth largest, respectively.
However, if we measure the sum of the correlation values of each measurement with the others, as shown in Table 9 and Figure 15, the correlation of the SE with the other measurement data is the highest, and those of the FT, BR, and EEG with the other data are the second, third, and fourth highest, respectively. From these results, we find that the difference between the BRs before and during the last one minute of watching the 3D display is smaller than that between the EEGs, as shown in Table 7, but the BR is more credible than the EEG considering the correlation with the other data in Table 9. In conclusion, we find that the order of credibility of the measurements of eye fatigue on a 3D display is SE, FT, BR, and EEG.
4. Conclusions
In this paper, we propose a novel assessment of eye fatigue caused by 3D displays based on a multimodal measurement method. In order to enhance the accuracy with which we assess eye fatigue, we measured the changes in the EEG signal, eye BR, FT, and SE before and after (or during the last 1 min of) watching a 3D display. To accurately measure the eye BR in a manner that is convenient for the user, we implemented a remote gaze-tracking system using a high-speed mega-pixel camera that can measure the eye blinks of both eyes. The change in the FT is measured using a remote thermal camera, which can enhance the measurement of eye fatigue. Experimental results showed that the correlation between the BR and the SE score is the highest, while that between the BR and FT is the lowest for the measurement of eye fatigue with a 3D display. In addition, the sum of all the correlation values of the SE with the other data (FT, BR, and EEG) is the highest, and those of the FT, BR, and EEG with the other data are the second, third, and fourth highest, respectively.
Because we aim to collect accurate EEG, BR, and FT data for the accurate measurement of the level of eye fatigue in a way that is non-intrusive to the user, our system is somewhat complicated, as shown in Figure 2. In future work, we plan to simplify our capturing system. In addition, we plan to perform a more accurate evaluation of eye fatigue by combining the EEG signal, eye BR, FT, and SE based on approaches such as fuzzy rules or principal component analysis (PCA).
Acknowledgments
This work was partly supported by the SW R&D program of MSIP/IITP [10047146, Real-time Crime Prediction and Prevention System based on Entropy-Filtering Predictive Analytics of Integrated Information such as Crime-Inducing Environment, Behavior Pattern, and Psychological Information], and in part by the Public Welfare & Safety Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (NRF-2014-0020807).
Author Contributions
Jae Won Bang and Kang Ryoung Park designed the overall system for measuring eye fatigue. In addition, they wrote and revised the paper. Hwan Heo and Jong-Suk Choi implemented the measurement system of facial temperature by using the remote thermal camera.
Conflicts of Interest
The authors declare no conflict of interest.
References and Notes
- Urvoy, M.; Barkowsky, M.; Callet, P.L. How visual fatigue and discomfort impact 3D-TV quality of experience: A comprehensive review of technological, psychophysical, and psychological factors. Ann. Telecommun. 2013, 68, 641–655. [Google Scholar]
- Kim, D.; Choi, S.; Park, S.; Sohn, K. Stereoscopic Visual Fatigue Measurement Based on Fusional Response Curve and Eye-Blinks. Proceedings of International Conference on Digital Signal Processing, Corfu, Greece, 6–8 July 2011; pp. 1–6.
- Kim, D.; Choi, S.; Choi, J.; Shin, H.; Sohn, K. Visual fatigue monitoring system based on eye-movement and eye-blink detection. Proc. SPIE 2011, 7863, 786303-1–786303-8. [Google Scholar]
- Lee, E.C.; Heo, H.; Park, K.R. The comparative measurements of eyestrain caused by 2D and 3D displays. IEEE Trans. Consum. Electron. 2010, 56, 1677–1683. [Google Scholar]
- Heo, H.; Lee, W.O.; Shin, K.Y.; Park, K.R. Quantitative Measurement of Eyestrain on 3D Stereoscopic Display Considering the Eye Foveation Model and Edge Information. Sensors 2014, 14, 8577–8604. [Google Scholar]
- Yu, J.-H.; Lee, B.-H.; Kim, D.-H. EOG Based Eye Movement Measure of Visual Fatigue Caused by 2D and 3D Displays. Proceedings of the IEEE-EMBS International Conference on Biomedical and Health Informatics, Hong Kong and Shenzhen, China, 2–7 January 2012; pp. 305–308.
- Li, H.-C.O.; Seo, J.; Kham, K.; Lee, S. Measurement of 3D Visual Fatigue Using Event-Related Potential (ERP): 3D Oddball Paradigm. Proceedings of 3DTV Conference: The True Vision-Capture, Transmission and Display of 3D Video, Istanbul, Turkey, 28–30 May 2008; pp. 213–216.
- Chen, C.; Li, K.; Wu, Q.; Wang, H.; Qian, Z.; Sudlow, G. EEG-based detection and evaluation of fatigue caused by watching 3DTV. Displays 2013, 34, 81–88. [Google Scholar]
- Cho, H.; Kang, M.-K.; Yoon, K.-J.; Jun, S.C. Feasibility Study for Visual Discomfort Assessment on Stereo Images Using EEG. Proceedings of International Conference on 3D Imaging, Liège, Belgium, 3–5 December 2012; pp. 1–6.
- Park, S.J.; Oh, S.B.; Subramaniyam, M.; Lim, H.K. Human Impact Assessment of Watching 3D Television by Electrocardiogram and Subjective Evaluation. Proceedings of XX IMEKO World Congress—Metrology for Green Growth, Busan, Korea, 9–14 September 2012; pp. 1–4.
- Qian, Z.; Wang, X.; Lan, C.; Li, W. Analysis of Fatigue with 3D TV Based on EEG. Proceedings of International Conference on Orange Technologies, Tainan, Taiwan, 12–16 March 2013; pp. 306–309.
- Mun, S.; Park, M.-C.; Park, S.; Whang, M. SSVEP and ERP measurement of cognitive fatigue caused by stereoscopic 3D. Neurosci. Lett. 2012, 525, 89–94. [Google Scholar]
- Park, S.; Won, M.J.; Mun, S.; Lee, E.C.; Whang, M. Does visual fatigue from 3D displays affect autonomic regulation and heart rhythm? Int. J. Psychophysiol. 2014, 92, 42–48. [Google Scholar]
- Kim, C.J.; Park, S.; Won, M.J.; Whang, M.; Lee, E.C. Autonomic nervous system responses can reveal visual fatigue induced by 3D displays. Sensors 2013, 13, 13054–13062. [Google Scholar]
- Yuan, Z.; Kim, J.H.; Cho, J.D. Visual fatigue measurement model in stereoscopy based on Bayesian network. Opt. Eng. 2013, 52, 083110. [Google Scholar]
- 3D Active Glasses. Available online: http://www.samsung.com/hk_en/consumer/tv-av/televisions/tv-accessories/SSG-3700CR/XS?subsubtype=3d-glasses (accessed on 14 July 2014).
- Gazelle. Available online: http://www.ptgrey.com/products/gazelle/gazelle_camera_link.asp (accessed on 14 July 2014).
- ICI 7320 Pro Specifications. Available online: http://www.infraredcamerasinc.com/Thermal-Cameras/Fix-Mounted-Thermal-Cameras/ICI7320_Pro_fix-mounted_thermal_camera.html (accessed on 14 July 2014).
- EPOC Neuroheadset. Available online: http://www.emotiv.com/store/hardware/epoc-bci/epoc-neuroheadset/ (accessed on 14 July 2014).
- Bang, J.W.; Choi, J.-S.; Park, K.R. Noise reduction in brainwaves by using both EEG signals and frontal viewing camera images. Sensors 2013, 13, 6272–6294. [Google Scholar]
- Choi, J.-S.; Bang, J.W.; Park, K.R.; Whang, M. Enhanced perception of user intention by combining EEG and gaze-tracking for brain-computer interfaces (BCIs). Sensors 2013, 13, 3454–3472. [Google Scholar]
- Emotiv SDK. Available online: http://innovatec.co.jp/content/etc/ResearchEditionSDK.pdf (accessed on 14 July 2014).
- SFH 4550. Available online: http://www.jlab.org/accel/inj_group/laser2001/pockels_files/pockels_switch_notebook_files/SFH4550.pdf (accessed on 14 July 2014).
- Webcam C600. Available online: http://www.logitech.com/en-us/support/5869 (accessed on 14 July 2014).
- Li, H.-C.O. 3DTV Broadcasting Safety Guideline. Available online: http://www.tta.or.kr/data/reportDown.jsp?news_num=2772 (accessed on 3 September 2014).
- Stern, J.A.; Boyer, D.; Schroeder, D. Blink rate: A possible measure of fatigue. Hum. Factors 1994, 36, 285–297. [Google Scholar]
- Shin, K.Y.; Kim, Y.G.; Park, K.R. Enhanced iris recognition method based on multi-unit iris images. Opt. Eng. 2013, 52, 047201-1–047201-11. [Google Scholar]
- Viola, P.; Jones, M.J. Robust Real-Time Face Detection. Int. J. Comput. Vis. 2004, 57, 137–154. [Google Scholar]
- Dongleware. Available online: http://www.dongleware.de (accessed on 26 March 2014).
- Wolfgang, J.-K. On the preferred viewing distances to screen and document at VDU workplaces. Ergonomics 1990, 33, 1055–1063. [Google Scholar]
- Student's T-Test. Available online: http://en.wikipedia.org/wiki/Student's_t-test (accessed on 14 July 2014).
- Effect Size. Available online: http://en.wikipedia.org/wiki/Effect_size#Cohen.27s_d (accessed on 14 July 2014).
- Cohen, J. A power primer. Psychol. Bull. 1992, 112, 155–159. [Google Scholar]
- Correlation and Dependence. Available online: http://en.wikipedia.org/wiki/Correlation_and_dependence (accessed on 14 July 2014).
Table 1. Summarized comparisons of previous and proposed methods for measuring eye fatigue.

Category | Method | Description | Advantage | Disadvantage
---|---|---|---|---
Using single modality | Camera-based method [2–5] | Eye blink [2–5] and eye movement [3] were analyzed. | Less discomfort to the user than bio-signal-based methods. | Lower acquisition speed of images than bio-signal-based method.
 | Bio-signal-based method [6–12] | EOG [6], EEG [7–9,12], ECG [10,13], and blink signal from EEG [11] were analyzed. | Faster acquisition speed of data than camera-based method. | Attachment of sensors causes discomfort, and bio-signals may contain noise from muscle movements.
Using multiple modalities | Multiple bio-signal-based method [14] | Multiple bio-signals such as ECG, GSR, and SKT were measured. | Higher accuracy of eye fatigue measurement than single modality-based method. | More discomfort to user because of attachment of multiple sensors, which can induce incorrect eye fatigue measurement.
 | Hybrid method using both bio-signal and camera-based methods | EEG and BR were measured [15]. | | The accuracy enhancement of eye fatigue measurements using only two modalities (EEG and BR) is limited.
 | | EEG, BR, and FT were measured (proposed method). | | Additional thermal camera is required.
Table 2. Questionnaire used for the SE.

Six Questions for SE
---
I have difficulties in seeing
I have a strange feeling around the eyes
My eyes feel tired
I feel numb
I feel dizzy looking at the screen
I have a headache
Table 3. Average and standard deviation of SE scores before and after watching the 3D display.

 | Before Watching 3D Display | After Watching 3D Display
---|---|---
Average | 1.623 | 3.478
Standard deviation | 0.582 | 1.37
Table 4. Average and standard deviation of the beta-band amplitude of each electrode before and after watching the 3D display, with p-values.

Electrode | Average (Before) | Average (After) | Standard Deviation (Before) | Standard Deviation (After) | p-value
---|---|---|---|---|---
AF3 | 0.0921 | 0.1091 | 0.0435 | 0.0353 | 0.2491
AF4 | 0.0916 | 0.1001 | 0.0368 | 0.0362 | 0.5279
F3 | 0.1059 | 0.1178 | 0.0575 | 0.0526 | 0.5575
F4 | 0.1033 | 0.1102 | 0.046 | 0.0472 | 0.6909
F7 | 0.0906 | 0.1165 | 0.0347 | 0.0431 | 0.0814
F8 | 0.0763 | 0.1118 | 0.0362 | 0.0823 | 0.1424
FC5 | 0.1081 | 0.1086 | 0.0753 | 0.048 | 0.9812
FC6 | 0.0859 | 0.107 | 0.0403 | 0.0347 | 0.1365
O1 | 0.1002 | 0.1159 | 0.0494 | 0.0438 | 0.3662
O2 | 0.1054 | 0.1218 | 0.064 | 0.0475 | 0.4323
P7 | 0.0769 | 0.103 | 0.0369 | 0.0415 | 0.0795
P8 | 0.0984 | 0.1608 | 0.0466 | 0.1343 | 0.1076
T7 | 0.1037 | 0.1324 | 0.0668 | 0.0729 | 0.2708
T8 | 0.0954 | 0.0999 | 0.0568 | 0.0429 | 0.8071
Table 5. Average and standard deviation of the eye BR before watching the 3D display and during the last 1 min of watching.

 | Before Watching 3D Display | During the Last 1 Min of Watching 3D Display
---|---|---
Average | 18.667 | 22.6
Standard deviation | 9.832 | 10.041
Table 6. Average and standard deviation of the FT before and after watching the 3D display.

 | Before Watching 3D Display | After Watching 3D Display
---|---|---
Average | 15221.233 | 15099.446
Standard deviation | 94.511 | 79.937
Table 7. Cohen's d and effect size of each measurement before and after (or during the last 1 min of) watching the 3D display.

 | Cohen's d | Effect Size
---|---|---
EEG | 0.6644 | Large
BR | 0.3958 | Medium
FT | 1.3914 | Large
SE | 1.763 | Large
Table 8. Gradient, R2, and correlation value between each pair of measurements.

 | Gradient | R2 | Correlation
---|---|---|---
EEG vs. BR | −0.33 | 0.1285 | −0.3585
EEG vs. FT | −0.1154 | 0.0156 | −0.125
EEG vs. SE | −0.1584 | 0.0421 | −0.2052
BR vs. FT | 0.0381 | 0.0014 | 0.038
BR vs. SE | 0.5582 | 0.4427 | 0.6653
FT vs. SE | 0.4593 | 0.3015 | 0.5491
Table 9. Correlation matrix of the four measurements (EEG, BR, FT, and SE) and the sum of all the correlation values with the other data.

 | EEG | BR | FT | SE | Sum of All the Correlation Values with Other Data
---|---|---|---|---|---
EEG | 1 | −0.3585 | −0.125 | −0.2052 | −0.6887
BR | −0.3585 | 1 | 0.038 | 0.6653 | 0.3448
FT | −0.125 | 0.038 | 1 | 0.5491 | 0.4621
SE | −0.2052 | 0.6653 | 0.5491 | 1 | 1.0092