Abstract
Olfaction has not been explored in virtual reality environments to the same extent as the visual and auditory senses. Much less research has been done on olfactory devices, and very few of them can be easily integrated into virtual reality applications. Including odor in virtual reality simulations using a chemical device involves challenges such as possible diffusion into undesired areas, slow dissipation, the definition of various parameters (e.g., concentration, frequency, and duration), and an appropriate software solution for controlling the diffusion of the odor. This paper presents a non-intrusive, mobile, low-cost and wearable olfactory display, together with a software service that allows developers to easily create applications that deliver olfactory stimuli through a virtual reality headset. We also present a case study conducted with 32 people to evaluate their satisfaction when using the olfactory display. Our findings indicate that our solution works as expected, producing odor properly and being easy to integrate into applications.
1 Introduction
Virtual environments most often explore only sight and hearing, with tactile inputs a distant third that remains very restricted in comparison. Hence a number of other senses that are important to us and part of many of our experiences are ignored. In particular, olfaction is important to humans: it is used to develop object awareness, to perceive the season and the atmosphere of a place, and it is a fundamental part of eating. It influences our behavior and offers greater potential for survival by allowing us to detect hazards in food and in the environment [1, 2]. Our sense of smell uses the olfactory nerve to connect the external world directly to the limbic system, the set of brain structures that deal with emotions (e.g., sadness, anger, happiness, and fear), the startle reflex [3], voice pitch [4], pain [5], and memory [6, 7].
Olfactory displays (or odor interfaces) are devices that insert odors into virtual environments [8]. They may be multi-user devices [9] within custom virtual reality rooms, such as a CAVE (Cave Automatic Virtual Environment), or individual devices that inject fragrances with low dispersion, for example by using an air cannon to blow odors towards the user’s nose [10]. There is a long history of attempts to build such systems. Sensorama [11] was one of the first projects to incorporate olfactory stimuli with audio and visual information within a virtual reality environment. It also had movement, vibration and direct wind effects; however, it was not interactive.
Several studies have addressed the olfactory sense [12,13,14,15,16,17], but many challenges remain unsolved, such as the accurate definition of parameters for these displays (e.g., concentration, frequency, duration, the palette of components that can be mixed to create odors, and the diffusion of fragrances into undesired spaces [18,19,20]), and even whether we perceive olfactory stimuli during sleep and how they can be used to manipulate our dreams [21]. Moreover, some studies indicate that gender and age influence human olfaction [22, 23]. Olfactory displays need to be triggered by the application on specific cues, taking into consideration that it may take some time for an odor to reach the user’s nose. In addition, the physical integration of olfactory displays with the rest of the system also poses a challenge. These are some of the daunting problems still to be solved before odor can be fully incorporated into virtual reality environments.
To increase the feeling of reality in an immersive environment space, and to mimic the real world more closely, we developed a non-intrusive, mobile, low-cost and wearable olfactory display that is plugged into any virtual reality headset, allowing odors to reach the user’s nose immediately in a controlled way that does not spread throughout the entire room. We also developed a representational state transfer (REST) [24] software service to easily integrate this olfactory display with virtual reality applications.
Several research domains could benefit from a simple-to-use, widely available olfactory device. Mulsemedia work has been evolving lately and encompasses artistic expression, experimental and behavioral psychology, perfume design (with wide applications in the fashion and cleaning-product industries), safety training, and virtual reality and immersive experiences, among others.
The main contributions of this paper are threefold: (i) the creation of a non-intrusive, mobile, low-cost and wearable olfactory display that can be added to any glasses-based virtual reality headset; (ii) a software service that allows developers to easily control odors from their virtual reality applications; and (iii) a quantitative case study investigating user satisfaction with our olfactory display.
The remainder of the paper is organized as follows. Section 2 discusses related work; Sect. 3 presents the proposed olfactory display and software service; Sect. 4 describes our case study; Sect. 5 gives the results and a discussion; and lastly, Sect. 6 presents the conclusions.
2 Related work
The perception of odors has specific aspects related to the nervous system. The stimulus is caused by many odorous molecules that are independent from each other. Vision and hearing are determined by frequencies of light and sound in ranges that can be easily replicated by screens and speakers; odor, in contrast, arises from countless molecules that cannot be synthesized at will and on demand. Due to these factors, the perception of odor has not been fully explored in virtual reality applications. Yanagida et al. [25] showed that the main problems are related to the human mechanism of olfaction, which is not based on primary elements.
Some research has focused on the development of odor synthesis and detection [13, 26,27,28,29]. Traditionally, odor generation is based on chemical molecules, and can be designed for stationary or mobile emission. This approach is not invasive, i.e., users receive the odor in the same way as in the real world. Dobbelstein et al. [30] presented inScent, which can be worn as a pendant on a necklace, and a software program that allows developers to add odors to their mobile applications. The odor is not emitted directly at the user’s nose, which can cause a delay in detection and requires the device to generate a significant amount of odor. Bordegoni et al. [31] also presented a necklace device for museum exhibitions, with a relatively small size of 150 \(\times\) 150 \(\times\) 60 mm (without the tubes). Yamada et al. [32] presented a wearable olfactory display in which air containing the odor is conveyed to the user’s nose via tubes. The odor is generated based on the user’s position, which is detected by a spatial localization sensor. Although wearable, the device is tethered to a notebook. Platt [33] presented iSmell, which associates odors with web content.
Multiple odors can lead to the problem of lingering particles, requiring odor removal. Hasegawa et al. [34] presented Midair, which aims to control the spatial distribution of fragrances by generating electronically steerable ultrasound-driven narrow air flows. Another possible problem is the distribution of odor molecules in the environment, which can be overcome by adding a mask.
Research has also focused on the spatial and temporal control of odor rather than synthesis [32]. Tominaga et al. [35] presented the “Friend Park” (a feelable virtual space system including the transmission of odor and breeze), in which users can enter or leave four rooms via a doorway. In each room, the olfactory information associated with the room or an object is generated. Users move from room to room, but the olfactory device does not move. Users can also operate objects using a wind-force sensor. Herrera and McMahan [10] created a low-cost desktop-based olfactory display by limiting the user’s movements; it can only deliver one odor at a time. Niedenthal et al. [36] developed a handheld olfactory display connected via cable to a Raspberry Pi 3 computer and integrated into the controller of the HTC Vive VR system. The user holds the display with their hand. It was specifically designed to simulate situations in which objects are moved near the nose to feel the fragrance. Micaroni et al. [37] presented an olfactory display with several air pumps and containers attached to headphones. It can be mounted, for example, on an Oculus Rift HMD (head mounted display), but it is bulky and uncomfortable, not allowing the user to freely move around.
An interesting paper that investigates the odor effect in a virtual reality environment was presented by Baus and Bouchard [38], who compared ambient air, a pleasant odor, and an unpleasant odor. Their results indicate a stronger effect of unpleasant odors on the sense of presence. Ranasinghe et al. [39] showed a device that integrates thermal, wind and olfactory stimuli into virtual reality glasses. Their findings suggest that this integration can be feasible and effective, enhancing users’ sense of presence in virtual reality experiences. Covaci et al. [40] also evaluated olfactory and wind effects integrated with virtual reality systems, but they focused on 360\(^{\circ }\) non-interactive videos. Similar 360\(^{\circ }\) video experiments are presented by Guedes et al. [41] and Narciso et al. [42]. These experiments indicate that olfactory stimulus contributes to user presence in 360\(^{\circ }\) videos.
Most solutions are based on the use of odor molecules (chemicals). Another approach is a non-chemical implementation that stimulates the olfactory receptors in the nose with weak electrical pulses. The development of a non-chemical display is a difficult task that requires triggering the olfactory receptors in the olfactory epithelium [43, 44], for example by generating electrical pulses in the odor receptors in the nasal concha [45] or by implanting electrodes into the frontal lobe of the brain [46]. Although this is an interesting approach, it is complex, costly, and currently invasive.
A few portable commercial products are being introduced into the market. One is Feelreal,Footnote 1 which is not yet available for purchase. OVR TechnologyFootnote 2 has a product in the test phase, limited to the nine fragrances they offer, and little documentation is available. VAQSO VR,Footnote 3 which has five odor holders and 15 pre-defined odors, is marketed for ¥330,000. Other olfactory devices for diverse purposes (e.g., Scent Dome, AromaJet, ScentWave, and Exhalia Diffuser SBi4) exist, and details and comparisons are available in [15,16,17, 47, 48].
In summary, although several olfactory displays have been proposed and implemented, most of these are expensive or are hard to reproduce due to their complexity; they rarely offer mobility, and do not provide a software solution allowing them to be integrated easily with virtual reality applications.
The project presented in this paper involves both a hardware and software solution. The olfactory display was created using commodity equipment, and the proposed software service allows developers to build virtual reality applications easily. It can be physically integrated with any HMD. Users have free mobility.
3 Olfactory device
Our objective is to design a device with visual, auditory and olfactory outputs, using simple and inexpensive components. We chose smartphones, mounted in a Google Cardboard, to provide the visual and auditory displays. Advances in game engines and mobile device hardware allow sophisticated virtual reality applications to run on smartphones, where we can use game engines for virtual reality software development and joysticks or other external devices for input.
Our requirements included being easy to attach to virtual reality glasses, lightweight, battery operated, low cost and allowing remote control through an API (Application Programming Interface) to turn it on and off. As discussed in this paper, there are almost no commercial systems available on the market, and we could not find an off-the-shelf olfactory device matching our requirements, so we chose to design it ourselves.
To build the olfactory device we needed a controller system and a fragrance emitter, plus electronics to drive the atomizer. For the controller we chose the Raspberry Pi 3 model B, a well-known small single board computer which is widely available and inexpensive. The dimensions of the system are 110 mm \(\times\) 56 mm \(\times\) 35 mm (circuit boards) and its total weight is 239 g (including battery and the odor repository). We used a Lithium Polymer Battery 3.7v 5 Ah.
Odors are generated using synthetic essences in liquid fragrances, which are vaporized by an ultrasonic atomizer (model KS-W20-112K; diameter: 20 mm; nominal frequency: 0.113 MHz; load capacitor: 3200 pF; RL: 200 \(\Omega\); drive level: 5 mW; input level: 5 V) controlled by the general-purpose input/output (GPIO) pins of the Raspberry Pi. We chose an ultrasonic fragrance emitter due to the wide availability of liquid fragrances, and because it is easy to control, has fast response times, and emits enough flow to immediately reach the user’s nostrils. It works by rapidly vibrating a piezoceramic component, which vaporizes the liquid in contact with it. We cannot control the intensity through voltage or current regulation, but intensity can be controlled by the amount of time the fragrance is on, switching it on and off for small periods of time. We picked an atomizer model with a liquid holder. We implemented a voltage regulator and a diffusion controller to convert the signal from the GPIO pins to the ultrasonic atomizer levels, and added a battery to power the system. The total cost of these parts and materials is approximately $50 USD. The system architecture is illustrated in Fig. 1.
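The time-based intensity control described above can be sketched as a software duty cycle. This is an illustrative sketch, not the authors' firmware: the GPIO pin number, the period, and the `set_pin` stub are our own assumptions standing in for the actual Raspberry Pi GPIO driver.

```python
import time

ATOMIZER_PIN = 18  # hypothetical GPIO pin; the actual wiring is not specified


def set_pin(pin, on):
    # On the real device this would call the GPIO library
    # (e.g. RPi.GPIO.output); here it is a stub so the timing
    # logic can run anywhere.
    pass


def emit_odor(duration_s, intensity, period_s=0.1):
    """Run the atomizer for duration_s seconds at a fractional
    intensity (0..1), approximated by switching it on for a
    fraction of each short period (software duty cycling)."""
    on_time = period_s * max(0.0, min(1.0, intensity))
    cycles = round(duration_s / period_s)
    for _ in range(cycles):
        if on_time > 0:
            set_pin(ATOMIZER_PIN, True)
            time.sleep(on_time)
        set_pin(ATOMIZER_PIN, False)
        time.sleep(period_s - on_time)
    return cycles
```

Halving `intensity` halves the fraction of each period during which liquid is vaporized, which is the only intensity control the atomizer permits, since voltage and current regulation are not available.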
The final prototype was still relatively small and light, and could be hidden under a cover fitting the front of the virtual reality glasses; it is shown in Fig. 2.
Communication between the Raspberry Pi and the smartphone was implemented using Wi-Fi. We had other options, such as Bluetooth, but we opted to use Wi-Fi because it allows remote control of the odor display from any other computer in the network and therefore could also work as a standalone remote controlled system, which is interesting for further studies. We can use either a local Wi-Fi network or the smartphone in access point (AP) mode for the connection.
We implemented a REST stack, with the Raspberry Pi running an HTTP server. REST is a well-known and widely used architecture, and HTTP requests can be easily implemented in any programming language, another reason to use Wi-Fi. The client application, running on the smartphone, sends requests (PUT/GET) to the olfactory display. The protocol contains commands to:
- Start odor emission, specifying which vaporizer to use. Although our prototype has only one vaporizer, the protocol already contemplates adding more;
- Stop odor emission, specifying which vaporizer to use;
- Get the status of the olfactory device.
Figure 3 depicts the behavioral model of the olfactory device (a UML state diagram), consisting of its states and the events that affect them. When the application starts, the olfactory device is in the Idle state. If the application triggers the Start event, odor emission is fired. During this state, if the application sends the Stop event, the olfactory device returns to the Idle state. The application can also send the Status event, which returns true if the olfactory device is emitting odor, or false if not.
Future extensions of the control software can be easily created by defining new REST endpoints on the HTTP server (e.g. odor intensity control and odor selection). We developed a middleware component, named Olfactory, to mediate communication between the HTTP server code and the GPIO interface.
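A minimal sketch of such a REST stack, using Python's standard-library HTTP server, is shown below. The endpoint paths (`/odor/<vaporizer>/start`, `/odor/<vaporizer>/stop`, `/odor/status`) and the `Olfactory` stand-in class are our own illustrative assumptions, not the authors' exact implementation.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class Olfactory:
    """Stand-in for the middleware that drives the GPIO pins;
    here it only tracks which vaporizers are emitting."""

    def __init__(self):
        self.emitting = {}

    def start(self, vaporizer):
        self.emitting[vaporizer] = True   # real code would raise the GPIO pin

    def stop(self, vaporizer):
        self.emitting[vaporizer] = False  # real code would lower the GPIO pin

    def status(self):
        return any(self.emitting.values())


olfactory = Olfactory()


class Handler(BaseHTTPRequestHandler):
    def _reply(self, code, payload):
        body = json.dumps(payload).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_PUT(self):
        # Illustrative paths: /odor/<vaporizer>/start and /odor/<vaporizer>/stop
        parts = self.path.strip("/").split("/")
        if len(parts) == 3 and parts[0] == "odor":
            vaporizer, action = parts[1], parts[2]
            if action == "start":
                olfactory.start(vaporizer)
                return self._reply(200, {"emitting": True})
            if action == "stop":
                olfactory.stop(vaporizer)
                return self._reply(200, {"emitting": False})
        self._reply(404, {"error": "unknown endpoint"})

    def do_GET(self):
        if self.path == "/odor/status":
            return self._reply(200, {"emitting": olfactory.status()})
        self._reply(404, {"error": "unknown endpoint"})

    def log_message(self, fmt, *args):
        pass  # keep the console quiet
```

Adding a new command (e.g., intensity control) would amount to handling one more path in `do_PUT`, which is why extending the protocol with new endpoints is straightforward.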
Our measurements show that the delay between a command and the emission of aerosol is well under 1 s, a lag that is imperceptible to the user. Since we use events to control the start and end of odor delivery, we have precise control over the duration of the odor.
4 Experiment
We carried out an experiment to evaluate user satisfaction (the degree to which users are free of discomfort and how much they like to use an application with the odor display). There is no consensus in the Human–Computer Interaction (HCI) area regarding the ideal sample size [49], so we followed Nielsen and Landauer’s [50] recommendations for medium to large size projects. The experiment was conducted with 32 users divided into two groups, one of which was presented with a stream containing flower fragrance and the other with pure water (control group).
Figure 4 depicts a scene from the application, which was developed with the UnityFootnote 4 game engine. When a user reaches the flowers along the path shown above, a stream of vapor is emitted containing either the flower fragrance (a lavender cologne from a popular local brand, which is a pleasant odor) or plain water. Although users process information unconsciously in many situations, such as when they navigate within an environment and do not notice certain objects, in this experiment we tried to make the user aware of the odor of flowers by presenting it for 6 s. We selected this period of time to overcome the possibility that it would not be detected, since a pleasant odor takes more time to be detected than an unpleasant one [38, 51]. This difference in reaction times may be related to the significance of unpleasant odors for survival (i.e., the identification of spoiled food).
Usage in Unity is simple, through an asset we created. A sphere or parallelepiped defines the volume in which the odor should be released (Fig. 5). The asset communicates with the display through the REST API, controlling the actual release. While this prototype has only one container, the asset also allows the developer to select which odor to release, for future devices with more emitters.
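In the Unity asset, the containment test is handled by the engine's colliders; as a language-neutral sketch of the same idea, the check for the two volume shapes reduces to a distance or per-axis range test (these function names are our own, for illustration):

```python
import math


def inside_sphere(user_pos, center, radius):
    """True when the user's position lies within the spherical
    volume in which the odor should be released."""
    return math.dist(user_pos, center) <= radius


def inside_box(user_pos, box_min, box_max):
    """Equivalent test for a parallelepiped volume, given its
    minimum and maximum corners."""
    return all(lo <= p <= hi
               for p, lo, hi in zip(user_pos, box_min, box_max))
```

When the test flips to true, the asset would issue the REST Start request to the display, and when it flips back to false, the Stop request.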
The experiment was broken down into four steps, which are listed in chronological order in Table 1. Thirty-two participants were recruited randomly at a Brazilian publishing houseFootnote 5 for this study, and no compensation was offered in exchange for participation. The users did not receive any type of training, to avoid creating any response based on a learning effect on olfactory sensory perception or on the use of the equipment. Users were divided into two groups, picked randomly and given exactly the same test conditions.
Participants were told it was a virtual reality experiment, and were asked to read and sign the informed consent form. After that they received the first pre-test questionnaire. The goal of the experiment was stated as an “investigation of the influence of odor in virtual reality immersion”. No explanation of the setup, or of whether the participant was in the odor or control group, was given.
Participants reached the flowers about 5 s after the application was started, and at this time the flower fragrance or pure water was emitted.
Table 2 depicts the post-test questionnaire used to evaluate users’ satisfaction, which was filled out after they used the application. The Likert scale [52] was adopted for the questionnaire, with items indicating strongly disagree, disagree, neither agree nor disagree, agree, and strongly agree.
5 Analysis of the results
Based on the results of the pre-test questionnaire, it was possible to determine the profile of the participants: all users (100%) reported that they had never used a virtual reality application before; 72% were male and 28% female; the users were between 16 and 59 years old; the median age was 31; only one participant rarely used a computer, and the others used one every day; and one user did not play games, 13 played games rarely, seven played games once per week, and seven played games every day.
Post-test questions 3, 5, 6, 7 and 9 reflected negative aspects of the application, i.e., answers with negative severity reflect a positive aspect. So that all answers could be read on a common scale, responses were mapped to a color scale in which colors towards one end always indicate a positive experience, even for these negatively phrased questions; this scale is used in Tables 3 and 4. To enable a weighted average to be calculated for the statistical tests, each color was given a weight (1–5) (see Fig. 6).
After Group 1 (the group with the flower odor) had tested the application, they filled out the post-test questionnaire. Table 3 gives details of the questionnaire and the results obtained in terms of the user experience. None of the participants left comments on the application; 84.8% of the answers gave positive feedback, strongly agreeing or agreeing with the aspect evaluated; 11.1% gave neutral feedback; and 4.1% gave negative feedback regarding the distraction caused, the pleasantness of the odor, the discomfort of the device and its performance.
After Group 2 (the control group, with water) had tested the application, they also filled out the questionnaire. Table 4 gives details of the questionnaire and the results obtained in regard to the user experience. None of the participants left comments on the application. In summary, 32.6% of the answers gave neutral feedback for the experience with water. However, even without an odor, 50% of them agreed or strongly agreed with the questions regarding immersion, perception of odor, and the use of this technology in applications. For 17.4% of respondents, the experience was below their expectations: the aerosol took a long time to release the odor at the moment they viewed the flowers, or they did not detect the odor, did not feel immersed in the environment, or were negatively distracted by the odor during navigation. Neither group reported any discomfort during the experience due to the device or the aerosol.
Figure 7 represents the opinions of the participants (very negative, negative, neutral, positive or very positive) in terms of their answers to the questionnaire. We notice a greater concentration of responses on the very positive and positive scale for group 1, the one which experienced the virtual environment with the odor of flowers. We conclude that using a fragrance increased the user satisfaction with the application.
The comparison is more striking when we consider questions directly related to the experience individually: the aroma group had a better experience, as shown by two-tailed t-tests.
Question 2 (“I detected a floral odor”) shows that our fragrance was easily detected by users in the odor group (\(M = 4.500\), SD = 0.612) compared to the second group (\(M = 2.938\), SD = 0.747), \(p = {6.70}\mathrm {e}{-7}\), validating that our odor device was emitting the odor at an appreciable level. We also noticed no difference at all between the presence and absence of odor in the comfort of the device regarding the aerosol (question 6), \(p = 1.000\).
Evaluation of satisfaction was also remarkably different, with the aroma group finding the experience more pleasurable (\(M = 4.313\), SD = 0.845) than the second one (\(M=2.875\), SD = 1.166), \(p={5.52}\mathrm {e}{-4}\). It is also relevant to notice that the feeling of immersion (question 1) was considered higher by the odor group (\(M = 4.500\), SD = 0.500) versus the second (M = 4.063, SD = 0.658), \(p = 0.0493 < 0.05\). Figure 8 depicts the averages for each question according to the two groups.
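The group comparisons above can be reproduced from the reported summary statistics. The paper does not state which t-test variant was used, so the sketch below assumes Welch's unequal-variance form; with 16 participants per group, the resulting t statistic for question 2 comes out around 6.5.

```python
import math


def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic computed from group means, sample
    standard deviations, and group sizes (an assumption about
    the test variant, since the paper only reports p-values)."""
    return (m1 - m2) / math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)


# Question 2 ("I detected a floral odor"), odor group vs. control,
# using the means and SDs reported in the paper:
t_q2 = welch_t(4.500, 0.612, 16, 2.938, 0.747, 16)
```

A t statistic this large with roughly 30 degrees of freedom corresponds to a two-tailed p-value on the order of the reported \(p = {6.70}\mathrm {e}{-7}\).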
We calculated Cronbach’s \(\alpha\) coefficients for both datasets, which are respectively 0.775 and 0.783, suggesting that they have high internal consistency.
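Cronbach's \(\alpha\) can be computed directly from the per-question score columns of a questionnaire. The sketch below uses the standard formula; the data in the test are synthetic, for illustration only, since the raw responses are not published.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a questionnaire, given one list of
    scores per item (each inner list holds that question's
    scores across all respondents)."""
    k = len(items)       # number of items (questions)
    n = len(items[0])    # number of respondents

    def pvar(xs):        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    sum_item_var = sum(pvar(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_var / pvar(totals))
```

Values above roughly 0.7, like the 0.775 and 0.783 reported here, are conventionally read as acceptable internal consistency.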
6 Conclusions
In this paper we present a non-intrusive, mobile, low-cost, wearable olfactory display that can be integrated with virtual reality glasses.
Future virtual reality applications should not only use visual and audio stimuli but reproduce our complete sensorial experience, including odor. Olfaction is an important perceptual function for human beings, able to alter heart rate, respiration rate, blood oxygen, skin resistance and blood pressure. It is processed by the brain in relation to context, anticipation, and previous learning, enhancing the user’s experiences through emotions and memory, although we are rarely conscious of it. However, it is rarely used in virtual reality applications, and insufficient effort has been made so far to include it. While there are a couple of commercial projects developing olfactory displays, none of them is currently available for purchase.
The olfactory display we developed can be fitted to the front of any virtual reality glasses, since it is small and light, leaving it very close to the user’s nose. Hence it provides full mobility and the odor reaches the user’s nose quickly. The solution does not have appendages that can bother the user (i.e., pipes). It uses commercially available off-the-shelf components, all affordable and easily purchased. The result is a low cost and easy-to-reproduce system. While our first prototype was built with just one fragrance repository, it was designed for multiple odors.
New applications with odor can be quickly developed using the software service described here. The device has the potential to be used in virtual reality applications across different domains, such as tourism, marketing, education, training, and health. For example, it could assist the diagnosis of anosmia in patients suspected of COVID-19. We observed that users of the application with odor reported a better experience than those who used the application with pure water, when we analyzed immersion, user satisfaction and perception. We argue that this olfactory device generated a positive user experience.
As this work was intended to validate the prototype, there are some limitations that will be addressed in future work, particularly the simple application with one odor. A more complex application with more than one fragrance is important to mimic more realistic scenarios. We are now creating a new version with more than one odor repository, and creating other virtual reality applications with different odors and game engines. In future work, the same study could be conducted with an unpleasant odor, to compare user perception.
Notes
Version 2020.1.0b11.3880.
References
Danthiir, V., Roberts, R.D., Pallier, G., Stankov, L.: What the nose knows: olfaction and cognitive abilities. Intelligence 29(4), 337–361 (2001). https://doi.org/10.1016/S0160-2896(01)00061-7.
McGann, J.P.: Poor human olfaction is a 19th-century myth. Science (2017). https://doi.org/10.1126/science.aam7263
Miltner, W., Matjak, M., Braun, C., Diekmann, H., Brody, S.: Emotional qualities of odors and their influence on the startle reflex in humans. Psychophysiology 31(1), 107–110 (1994). https://doi.org/10.1111/j.1469-8986.1994.tb01030.x
Millot, J.L., Brand, G.: Effects of pleasant and unpleasant ambient odors on human voice pitch. Neurosci. Lett. 297(1), 61–63 (2001). https://doi.org/10.1016/S0304-3940(00)01668-2. Accessed 12 Mar 2022
Marchand, S., Arsenault, P.: Odors modulate pain perception: a gender-specific effect. Physiol. Behav. 76(2), 251–256 (2002). https://doi.org/10.1016/S0031-9384(02)00703-5.
Morrow, B.A., Roth, R.H., Elsworth, J.D.: TMT, a predator odor, elevates mesoprefrontal dopamine metabolic activity and disrupts short-term working memory in the rat. Brain Res. Bull. 52(6), 519–523 (2000). https://doi.org/10.1016/S0361-9230(00)00290-2.
Moss, M., Cook, J., Wesnes, K., Duckett, P.: Aromas of rosemary and lavender essential oils differentially affect cognition and mood in healthy adults. Int. J. Neurosci. 113(1), 15–38 (2003). https://doi.org/10.1080/00207450390161903
Barfield, W., Danas, E.: Comments on the use of olfactory displays for virtual environments. Presence Teleoper. Virtual Environ. 5(1), 109–121 (1996). https://doi.org/10.1162/pres.1996.5.1.109
Pletts, J., Turin, L.: Scents of Space. http://www.haque.co.uk/scentsofspace.php
Herrera, N.S., McMahan, R.P.: Development of a simple and low-cost olfactory display for immersive media experiences. In: Proceedings of the 2nd ACM International Workshop on Immersive Media Experiences, ImmersiveMe ’14, pp. 1–6. Association for Computing Machinery, New York, NY, USA (2014). https://doi.org/10.1145/2660579.2660584. (Event-place: Orlando, Florida, USA)
Heilig, M.L.: Sensorama simulator (1962). https://patents.google.com/patent/US3050870A/en
Chen, Y.: Olfactory display: development and application in virtual reality therapy. In: 16th International Conference on Artificial Reality and Telexistence-Workshops (ICAT’06), pp. 580–584 (2006). https://doi.org/10.1109/ICAT.2006.95
Nakamoto, T., Minh, H.P.D.: Improvement of olfactory display using solenoid valves. In: 2007 IEEE Virtual Reality Conference, pp. 179–186 (2007). https://doi.org/10.1109/VR.2007.352479
Yanagida, Y.: A survey of olfactory displays: making and delivering scents. In: 2012 IEEE SENSORS, pp. 1–4 (2012). https://doi.org/10.1109/ICSENS.2012.6411380 (ISSN: 1930-0395)
Murray, N., Lee, B., Qiao, Y., Muntean, G.M.: Olfaction-enhanced multimedia: a survey of application domains, displays, and research challenges. ACM Comput. Surv. 48(4), 56:1-56:34 (2016). https://doi.org/10.1145/2816454
Braun, M.H.: Enhancing user experience with olfaction in virtual reality (2019). https://openaccess.city.ac.uk/id/eprint/22379/
Saleme, E.A.B., Covaci, A., Mesfin, G., Santos, C.A.S., Ghinea, G.: Mulsemedia diy: a survey of devices and a tutorial for building your own mulsemedia environment. ACM Comput. Surv. (2019). https://doi.org/10.1145/3319853
Cain, W.: To know with the nose: keys to odor identification. Science 203(4379), 467 (1979). https://doi.org/10.1126/science.760202. Accessed 12 Mar 2022
Sugiyama, H., Ayabe-Kanamura, S., Kikuchi, T.: Are olfactory images sensory in nature? Perception 35(12), 1699–1708 (2006). https://doi.org/10.1068/p5453
Kato, S., Nakamoto, T.: Wearable olfactory display with less residual odor. In: 2019 IEEE International Symposium on Olfaction and Electronic Nose (ISOEN), pp. 1–3 (2019). https://doi.org/10.1109/ISOEN.2019.8823231
Braun, M.H., Cheok, A.D.: Towards an olfactory computer-dream interface. In: Proceedings of the 11th Conference on Advances in Computer Entertainment Technology, ACE ’14. Association for Computing Machinery, New York, NY, USA (2014). https://doi.org/10.1145/2663806.2663874
Sorokowski, P., Karwowski, M., Misiak, M., Marczak, M.K., Dziekan, M., Hummel, T., Sorokowska, A.: Sex differences in human olfaction: a meta-analysis. Front. Psychol. (2019). https://doi.org/10.3389/fpsyg.2019.00242
Wang, X., Zhang, C., Xia, X., Yang, Y., Zhou, C.: Effect of gender on odor identification at different life stages: a meta-analysis. Rhinology 57(5), 322–330 (2019). https://doi.org/10.4193/Rhin19.005
Pautasso, C., Wilde, E., Alarcon, R. (eds.): REST: Advanced Research Topics and Practical Applications, 2014th edn. Springer, Berlin (2013)
Yanagida, Y., Adachi, T., Miyasato, T., Tomono, A., Kawato,S., Noma, H., Hosaka,K.: Integrating a projection-based olfactory display with interactive audio-visual contents. In: Proceedings HCI International p. 10 (2005)
Iwata, H., Yano, H., Moriya, T., Uemura, T.: Food simulator: a haptic interface for biting. In: Virtual Reality Conference, IEEE, p. 51 (2004). https://doi.org/10.1109/VR.2004.1310055
Hashimoto, K., Nakamoto, T.: Tiny olfactory display using surface acoustic wave device and micropumps for wearable applications. IEEE Sens. J. 16(12), 4974–4980 (2016). https://doi.org/10.1109/JSEN.2016.2550486
Nakamoto, T., Otaguro, S., Kinoshita, M., Nagahama, M., Ohinishi, K., Ishida, T.: Cooking up an interactive olfactory game display. IEEE Comput. Graphics Appl. 28(1), 75–78 (2008). https://doi.org/10.1109/MCG.2008.3
Yokoshiki, Y., Nakamoto, T.: Study of odor preconcentrator using SAW device. Proc. IMCS 2012, 477–480 (2012). https://doi.org/10.5162/IMCS2012/5.4.3
Dobbelstein, D., Rukzio, E., Herrdum, S.: Demonstration of InScent: a wearable olfactory display as an amplification for mobile notifications. In: Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers, UbiComp ’17, pp. 229–232. Association for Computing Machinery, New York, NY, USA (2017). https://doi.org/10.1145/3123024.3123185
Bordegoni, M., Carulli, M., Bader, S.: Wearable olfactory display for museum exhibitions. In: 2019 IEEE International Symposium on Olfaction and Electronic Nose (ISOEN), pp. 1–3 (2019). https://doi.org/10.1109/ISOEN.2019.8823224
Yamada, T., Yokoyama, S., Tanikawa, T., Hirota, K., Hirose, M.: Wearable olfactory display: using odor in outdoor environment. In: IEEE Virtual Reality Conference (VR 2006), pp. 199–206 (2006). https://doi.org/10.1109/VR.2006.147
Platt, C.: You’ve got smell! Wired (1999). https://www.wired.com/1999/11/digiscent/
Hasegawa, K., Qiu, L., Shinoda, H.: Midair ultrasound fragrance rendering. IEEE Trans. Vis. Comput. Graphics 24(4), 1477–1485 (2018). https://doi.org/10.1109/TVCG.2018.2794118
Tominaga, K., Honda, S., Ohsawa, T., Shigeno, H., Okada, K., Matsushita, Y.: “Friend Park”: expression of the wind and the scent on virtual space. In: Proceedings of the Seventh International Conference on Virtual Systems and Multimedia, pp. 507–515 (2001). https://doi.org/10.1109/VSMM.2001.969706
Niedenthal, S., Lundén, P., Ehrndal, M., Olofsson, J.K.: A handheld olfactory display for smell-enabled VR games. In: 2019 IEEE International Symposium on Olfaction and Electronic Nose (ISOEN), pp. 1–4 (2019). https://doi.org/10.1109/ISOEN.2019.8823162
Micaroni, L., Carulli, M., Ferrise, F., Gallace, A., Bordegoni, M.: An olfactory display to study the integration of vision and olfaction in a virtual reality environment. J. Comput. Inf. Sci. Eng. 19(3) (2019). https://doi.org/10.1115/1.4043068
Baus, O., Bouchard, S.: Exposure to an unpleasant odour increases the sense of Presence in virtual reality. Virtual Real. 21(2), 59–74 (2017). https://doi.org/10.1007/s10055-016-0299-3
Ranasinghe, N., Jain, P., Thi Ngoc Tram, N., Koh, K.C.R., Tolley, D., Karwita, S., Lien-Ya, L., Liangkun, Y., Shamaiah, K., Eason Wai Tung, C., Yen, C.C., Do, E.Y.L.: Season traveller: multisensory narration for enhancing the virtual reality experience. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, CHI ’18, pp. 1–13. Association for Computing Machinery, New York, NY, USA (2018). https://doi.org/10.1145/3173574.3174151
Covaci, A., Trestian, R., Saleme, E.a.B., Comsa, I.S., Assres, G., Santos, C.A.S., Ghinea, G.: 360° mulsemedia: a way to improve subjective QoE in 360° videos. In: Proceedings of the 27th ACM International Conference on Multimedia, MM ’19, pp. 2378–2386. Association for Computing Machinery, New York, NY, USA (2019). https://doi.org/10.1145/3343031.3350954. Accessed 12 Mar 2022
Guedes, A.L.V., de A. Azevedo, R.G., Frossard, P., Colcher, S., Junqueira Barbosa, S.D.: Subjective evaluation of 360-degree sensory experiences. In: 2019 IEEE 21st International Workshop on Multimedia Signal Processing (MMSP), pp. 1–6 (2019). https://doi.org/10.1109/MMSP.2019.8901743. Accessed 12 Mar 2022
Narciso, D., Melo, M., Vasconcelos-Raposo, J., Bessa, M.: The impact of olfactory and wind stimuli on 360 videos using head-mounted displays. ACM Trans. Appl. Percept. (2020). https://doi.org/10.1145/3380903
Hariri, S., Mustafa, N.A., Karunanayaka, K., Cheok, A.D.: Electrical stimulation of olfactory receptors for digitizing smell. In: Proceedings of the 2016 Workshop on Multimodal Virtual and Augmented Reality, MVAR ’16. Association for Computing Machinery, New York, NY, USA (2016). https://doi.org/10.1145/3001959.3001964
Ishimaru, T., Shimada, T., Sakumoto, M., Miwa, T., Kimura, Y., Furukawa, M.: Olfactory evoked potential produced by electrical stimulation of the human olfactory mucosa. Chem. Senses 22(1), 77–81 (1997). https://doi.org/10.1093/chemse/22.1.77
Cheok, A.D., Karunanayaka, K.: Virtual taste and smell technologies for multisensory internet and virtual reality. In: Human-Computer Interaction Series. Springer International Publishing (2018)
Kumar, G., Juhász, C., Sood, S., Asano, E.: Olfactory hallucinations elicited by electrical stimulation via subdural electrodes: Effects of direct stimulation of olfactory bulb and tract. Epilepsy Behav. 24(2), 264–268 (2012). https://doi.org/10.1016/j.yebeh.2012.03.027
Murray, N., Ademoye, O.A., Ghinea, G., Muntean, G.M.: A tutorial for olfaction-based multisensorial media application design and evaluation. ACM Comput. Surv. (2017). https://doi.org/10.1145/3108243
Dmitrenko, D., Vi, C.T., Obrist, M.: A comparison of scent-delivery devices and their meaningful use for in-car olfactory interaction. In: Proceedings of the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Automotive’UI 16, pp. 23–26. Association for Computing Machinery, New York, NY, USA (2016). https://doi.org/10.1145/3003715.3005464
Lazar, J., Feng, J.H., Hochheiser, H.: Research Methods in Human-Computer Interaction, 2nd edn. Morgan Kaufmann Publishers, Cambridge, MA (2017)
Nielsen, J., Landauer, T.K.: A mathematical model of the finding of usability problems. In: Proceedings of the INTERACT ’93 and CHI ’93 Conference on Human Factors in Computing Systems, CHI ’93, pp. 206–213. Association for Computing Machinery, New York, NY, USA (1993). https://doi.org/10.1145/169059.169166
Jacob, T.J., Wang, L.: A new method for measuring reaction times for odour detection at iso-intensity: comparison between an unpleasant and pleasant odour. Physiol. Behav. 87(3), 500–505 (2006). https://doi.org/10.1016/j.physbeh.2005.11.018
Likert, R.: A technique for the measurement of attitudes. Arch. Psychol. 22(140), 55 (1932)
Acknowledgements
This study was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior-Brasil (CAPES)-Finance Code 001.
Additional information
Communicated by N. Murray.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
Below is the link to the electronic supplementary material.
Supplementary file1 (MP4 44656 KB)
de Paiva Guimarães, M., Martins, J.M., Dias, D.R.C. et al. An olfactory display for virtual reality glasses. Multimedia Systems 28, 1573–1583 (2022). https://doi.org/10.1007/s00530-022-00908-8