1 Introduction

Virtual environments most often explore only visual and auditory stimuli, with tactile input a distant third that is, moreover, very restricted in comparison. A number of other senses that are important to us and part of many of our experiences are thus ignored. In particular, olfaction matters to humans: it is used to develop object awareness, to perceive the season and atmosphere of a place, and is a fundamental part of eating. It influences our behavior and improves our chances of survival by allowing us to detect hazards in food and in the environment [1, 2]. Our sense of smell uses the olfactory nerve to connect the external world directly to the limbic system, the set of brain structures that deals with emotions (e.g., sadness, anger, happiness, and fear), the startle reflex [3], voice pitch [4], pain [5], and memory [6, 7].

Olfactory displays (or odor interfaces) are devices that insert odors into virtual environments [8]. They may be multi-user devices [9] within custom virtual reality rooms, such as a CAVE (Cave Automatic Virtual Environment), or individual devices that inject fragrances with low dispersion, for example by using an air cannon to blow odors toward the user's nose [10]. There is a long history of attempts to build such systems. Sensorama [11] was one of the first projects to combine olfactory stimuli with audio and visual information in a virtual reality environment. It also provided movement, vibration, and direct wind effects; however, it was not interactive.

Several studies have addressed the olfactory sense [12,13,14,15,16,17], but many challenges remain unsolved, such as the accurate definition of parameters for these displays (e.g., concentration, frequency, duration, and the palette of components that can be mixed to create odors), the diffusion of fragrances into undesired spaces [18,19,20], and even whether we perceive olfactory stimuli while asleep and how they could be used to manipulate our dreams [21]. Moreover, some studies indicate that gender and age influence human olfaction [22, 23]. Olfactory displays need to be triggered by the application on specific cues, taking into account that an odor may take some time to reach the user's nose. In addition, the physical integration of olfactory displays with the rest of the system also poses a challenge. These are some of the daunting problems that must be solved before odor can be fully incorporated into virtual reality environments.

To increase the feeling of reality in an immersive environment, and to mimic the real world more closely, we developed a non-intrusive, mobile, low-cost, wearable olfactory display that can be attached to any virtual reality headset, allowing odors to reach the user's nose immediately and in a controlled way, without spreading throughout the entire room. We also developed a representational state transfer (REST) [24] software service to easily integrate this olfactory display with virtual reality applications.

Several research domains could benefit from a simple-to-use, widely available olfactory device. Mulsemedia (multiple sensorial media) work has been evolving lately and encompasses artistic expression, experimental and behavioral psychology, perfume design (with wide applications in the fashion and cleaning-product industries), safety training, virtual reality and immersive experiences, among others.

The main contributions of this paper are threefold: (i) a non-intrusive, mobile, low-cost, wearable olfactory display that can be added to any glasses-based virtual reality headset; (ii) a software service that allows developers to easily control odors from their virtual reality applications; and (iii) a quantitative case study investigating user satisfaction with our olfactory display.

The remainder of the paper is organized as follows. Section 2 discusses related work; Sect. 3 presents the proposed olfactory display and software service; Sect. 4 describes our case study; Sect. 5 gives the results and a discussion; and lastly, Sect. 6 presents the conclusions.

2 Related work

The perception of odors has specific characteristics related to the nervous system. The stimulus is caused by many odorous molecules that are independent of each other. Vision and hearing are determined by frequencies of light and sound within ranges that can be easily replicated by screens and speakers, but there are countless odorous molecules, which cannot be synthesized at will and on demand. For these reasons, the perception of odor has not been fully explored in virtual reality applications. Yanagida et al. [25] showed that the main problems are related to the human mechanism of olfaction, which is not based on primary elements.

Some research has focused on the development of odor synthesis and detection [13, 26,27,28,29]. Traditionally, odor generation is based on chemical molecules and can be designed for stationary or mobile emission. This approach is not invasive, i.e., users receive the odor in the same way as in the real world. Dobbelstein et al. [30] presented inScent, which can be worn as a pendant on a necklace, together with a software program that allows developers to add odors to their mobile applications. The odor is not emitted directly at the user's nose, which can delay detection and requires the device to generate a significant amount of odor. Bordegoni et al. [31] also presented a necklace device, for museum exhibitions, with a relatively small size of 150 \(\times\) 150 \(\times\) 60 mm (excluding the tubes). Yamada et al. [32] presented a wearable olfactory display in which air containing the odor is conveyed to the user's nose via tubes. The odor is generated based on the user's position, detected by a spatial localization sensor; although the device is wearable, it must be tethered to a notebook computer. Platt [33] presented iSmell, which associates odors with web content.

Multiple odors can lead to the problem of lingering particles, requiring odor removal. Hasegawa et al. [34] presented Midair, which aims to control the spatial distribution of fragrances by generating electronically steerable, ultrasound-driven narrow air flows. Another possible problem is the spread of odor molecules throughout the environment, which can be mitigated by adding a mask.

Research has also focused on the spatial and temporal control of odor rather than its synthesis [32]. Tominaga et al. [35] presented "Friend Park" (a feelable virtual space system including the transmission of odor and breeze), in which users can enter or leave four rooms via doorways. In each room, the olfactory information associated with the room or an object is generated; users move from room to room, but the olfactory device does not move. Users can also operate objects using a wind-force sensor. Herrera and McMahan [10] created a low-cost desktop-based olfactory display by limiting the user's movements; it can only deliver one odor at a time. Niedenthal et al. [36] developed a handheld olfactory display connected via cable to a Raspberry Pi 3 computer and integrated into the controller of the HTC Vive VR system. The user holds the display in their hand; it was specifically designed to simulate situations in which objects are moved near the nose to smell their fragrance. Micaroni et al. [37] presented an olfactory display with several air pumps and containers attached to headphones. It can be mounted, for example, on an Oculus Rift HMD (head-mounted display), but it is bulky and uncomfortable and does not allow the user to move around freely.

An interesting study of the effect of odor in a virtual reality environment was presented by Baus and Bouchard [38], who compared ambient air, a pleasant odor, and an unpleasant odor. Their results indicate a stronger effect of unpleasant odors on the sense of presence. Ranasinghe et al. [39] presented a device that integrates thermal, wind, and olfactory stimuli into virtual reality glasses. Their findings suggest that this integration can be feasible and effective, enhancing users' sense of presence in virtual reality experiences. Covaci et al. [40] also evaluated olfactory and wind effects integrated with virtual reality systems, but they focused on 360\(^{\circ }\) non-interactive videos. Similar 360\(^{\circ }\) video experiments were presented by Guedes et al. [41] and Narciso et al. [42]. These experiments indicate that olfactory stimuli contribute to user presence in 360\(^{\circ }\) videos.

Most solutions are based on the use of odor molecules (chemicals). Another approach is a non-chemical implementation that stimulates the olfactory receptors in the nose with weak electrical pulses. Developing a non-chemical display is a difficult task that requires triggering the olfactory receptors in the olfactory epithelium [43, 44], for example by generating electrical pulses at the odor receptors in the nasal concha [45] or by implanting electrodes into the frontal lobe of the brain [46]. Although this is an interesting approach, it is complex, costly, and, at this point, invasive.

A few portable commercial products are being introduced into the market. One is Feelreal, which is not yet available for purchase. OVR Technology has a product in the test phase, limited to the nine fragrances the company offers, with little documentation available. VAQSO VR has five odor holders and 15 pre-defined odors, and is marketed at ¥330,000. Other olfactory devices exist for diverse purposes (e.g., Scent Dome, AromaJet, ScentWave, and Exhalia Diffuser SBi4); details and comparisons are available in [15,16,17, 47, 48].

In summary, although several olfactory displays have been proposed and implemented, most are expensive or hard to reproduce due to their complexity; they rarely offer mobility and do not provide a software solution allowing easy integration with virtual reality applications.

The project presented in this paper comprises both a hardware and a software solution. The olfactory display was built from commodity equipment, and the proposed software service allows developers to build virtual reality applications easily. The display can be physically integrated with any HMD, and users retain full mobility.

3 Olfactory device

Our objective was to design a device with visual, auditory, and olfactory outputs using simple and inexpensive components. We chose smartphones, mounted in a Google Cardboard, to provide the visual and auditory displays. Advances in game engines and mobile device hardware now allow sophisticated virtual reality applications to run on smartphones; we can therefore use game engines for virtual reality software development and joysticks or other external devices for input.

Our requirements included being easy to attach to virtual reality glasses, lightweight, battery operated, low cost, and controllable remotely through an API (Application Programming Interface) that can turn it on and off. As discussed above, there are almost no commercial systems available on the market, and we could not find an off-the-shelf olfactory device matching our requirements, so we chose to design the device ourselves.

To build the olfactory device we needed a controller system, a fragrance emitter, and the electronics to drive the atomizer. For the controller we chose the Raspberry Pi 3 Model B, a well-known single-board computer that is widely available and inexpensive. The dimensions of the system are 110 mm \(\times\) 56 mm \(\times\) 35 mm (circuit boards), and its total weight is 239 g (including the battery and the odor repository). We used a 3.7 V, 5 Ah lithium-polymer battery.

Odors are generated from synthetic essences in liquid fragrances, which are vaporized by an ultrasonic atomizer (model KS-W20-112K; diameter: 20 mm; nominal frequency: 0.113 MHz; load capacitor: 3200 pF; load resistance (RL): 200 \(\Omega\); drive level: 5 mW; input level: 5 V) controlled by the general-purpose input/output (GPIO) pins of the Raspberry Pi. We chose an ultrasonic fragrance emitter because liquid fragrances are widely available, and because it is easy to control, has fast response times, and emits enough flow to reach the user's nostrils immediately. It works by rapidly vibrating a piezoceramic component, which vaporizes the liquid in contact with it. We cannot control the intensity through voltage or current regulation, but intensity can be controlled by the amount of time the fragrance is on, switching it on and off for short periods. We picked an atomizer model with a liquid holder. We implemented a voltage regulator and a diffuser controller to convert the signal from the GPIO pins to the levels required by the ultrasonic atomizer, and added a battery to power the system. The total cost of these parts and materials is approximately $50 USD. The system architecture is illustrated in Fig. 1.
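To make the switching scheme concrete, the sketch below pulses the atomizer from a GPIO pin, with the fraction of each period the pin stays high acting as the intensity. This is a minimal sketch: the pin number, switching period, and use of the RPi.GPIO library are assumptions for illustration, not the exact values used in the device.

```python
# Minimal sketch of time-based intensity control for the atomizer.
# Assumptions: the atomizer driver input is wired to BCM pin 18 and the
# RPi.GPIO library is used; both are illustrative, not from the paper.
import time
import RPi.GPIO as GPIO

ATOMIZER_PIN = 18  # assumed wiring

GPIO.setmode(GPIO.BCM)
GPIO.setup(ATOMIZER_PIN, GPIO.OUT)

def emit(duration_s, intensity=1.0, period_s=0.1):
    """Emit odor for duration_s seconds; intensity in (0, 1] is the
    fraction of each switching period the atomizer is on."""
    end = time.time() + duration_s
    while time.time() < end:
        GPIO.output(ATOMIZER_PIN, GPIO.HIGH)
        time.sleep(period_s * intensity)
        GPIO.output(ATOMIZER_PIN, GPIO.LOW)
        time.sleep(period_s * (1.0 - intensity))

emit(6.0, intensity=0.5)  # e.g., 6 s at half intensity
GPIO.cleanup()
```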

Fig. 1 Architecture overview

The final prototype is relatively small and light and can be hidden under a cover that fits the front of the virtual reality glasses, as shown in Fig. 2.

Fig. 2 Virtual reality glasses with an olfactory display

Communication between the Raspberry Pi and the smartphone is implemented over Wi-Fi. We had other options, such as Bluetooth, but we opted for Wi-Fi because it allows the odor display to be controlled remotely from any other computer on the network; the display could therefore also work as a standalone remote-controlled system, which is interesting for further studies. We can use either a local Wi-Fi network or the smartphone in access point (AP) mode for the connection.

We implemented a REST stack, with the Raspberry Pi running an HTTP server. REST is a well-known and widely used architecture, and HTTP requests can be issued easily from virtually any programming language, which was another reason to choose Wi-Fi. The client application, running on the smartphone, sends requests (PUT/GET) to the olfactory display. The protocol contains commands to (a minimal server sketch follows the list):

  • Start the odor emission, specifying which vaporizer to use. Although our prototype had only one vaporizer, the protocol already provides for more;

  • Stop the odor emission, specifying which vaporizer to use;

  • Get the status of the olfactory device.
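The sketch below outlines such a server in Python using Flask. The endpoint paths, port, and the stub standing in for the GPIO-driving middleware are assumptions for illustration; the actual service may differ.

```python
# Minimal sketch of the REST service on the Raspberry Pi, using Flask.
# Endpoint paths, the port, and the middleware stub are illustrative
# assumptions; the paper does not specify them.
from flask import Flask, jsonify

app = Flask(__name__)
emitting = {1: False}  # one vaporizer in the prototype; more can be added

def set_vaporizer(vid, on):
    # Placeholder for the middleware that drives the GPIO pins.
    emitting[vid] = on

@app.route("/vaporizer/<int:vid>/start", methods=["PUT"])
def start(vid):
    set_vaporizer(vid, True)
    return jsonify(ok=True)

@app.route("/vaporizer/<int:vid>/stop", methods=["PUT"])
def stop(vid):
    set_vaporizer(vid, False)
    return jsonify(ok=True)

@app.route("/status", methods=["GET"])
def status():
    # True if any vaporizer is currently emitting odor.
    return jsonify(emitting=any(emitting.values()))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```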

Figure 3 depicts the behavioral model of the olfactory device (a UML state diagram), consisting of its states and the events that affect it. When the application starts, the olfactory device is in the Idle state. If the application triggers the Start event, odor emission begins. From this state, if the application sends the Stop event, the olfactory device returns to the Idle state. The application can also send the Status event, which returns true if the olfactory device is emitting odor and false otherwise.

Fig. 3 Olfactory device state diagram

Future extensions of the control software can easily be created by defining new REST endpoints on the HTTP server (e.g., odor intensity control and odor selection). We developed a middleware component, named Olfactory, to mediate communication between the HTTP server code and the GPIO interface.

Our measurements show that the delay between a command and the emission of aerosol is well under 1 s, a lag that is imperceptible to the user. Since we use events to control the start and end of odor delivery, we have precise control over the duration of the emission.
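For example, a client can deliver a timed burst of odor with two requests. In the Python sketch below, the device address and endpoint paths carry over the assumptions of the server sketch above:

```python
# Sketch of a client delivering a 6 s burst of odor via the REST API.
# The device address and endpoint paths are assumptions carried over
# from the server sketch above.
import time
import requests

DEVICE = "http://192.168.0.42:8080"  # assumed address of the Raspberry Pi

requests.put(f"{DEVICE}/vaporizer/1/start")
time.sleep(6.0)  # emission duration used in our experiment
requests.put(f"{DEVICE}/vaporizer/1/stop")

print(requests.get(f"{DEVICE}/status").json())  # {'emitting': False}
```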

4 Experiment

We carried out an experiment to evaluate user satisfaction (the degree to which users are free of discomfort and how much they like using an application with the odor display). There is no consensus in the Human-Computer Interaction (HCI) area regarding the ideal sample size [49], so we followed Nielsen and Landauer's [50] recommendations for medium to large projects. The experiment was conducted with 32 users divided into two groups: one was presented with a stream containing a flower fragrance, and the other with pure water (the control group).

Figure 4 depicts a scene from the application, which was developed with the Unity game engine. When a user reaches the flowers along the path shown in the figure, a stream of vapor is emitted containing either the flower fragrance (a lavender cologne from a popular local brand, a pleasant odor) or plain water. Although users process information unconsciously in many situations, such as when they navigate an environment without noticing certain objects, in this experiment we tried to make the user aware of the odor of flowers by presenting it for 6 s. We selected this duration to reduce the chance of the odor going undetected, since a pleasant odor takes longer to detect than an unpleasant one [38, 51]. This difference in reaction times may be related to the survival significance of unpleasant odors (e.g., identifying spoiled food).

Fig. 4 The developed virtual reality application

Usage in Unity is simple and is done through an asset we created. A sphere or parallelepiped defines the volume in which the odor should be released (Fig. 5); the asset communicates with the display through the REST API, controlling the actual release (see the sketch after Fig. 5). Although this prototype has only one container, the asset also lets the developer select which odor to release, for future devices with more emitters.

Fig. 5 How an application is developed with our asset in Unity. A sphere or box delimiting the odor volume is created; when the user is inside that volume, the odor is automatically released
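Conceptually, the asset's trigger logic is a point-in-volume test that fires the REST calls on entry and exit. The sketch below expresses the equivalent logic in Python (the actual asset is a Unity component; the class name, device address, and endpoints are illustrative):

```python
# Conceptual sketch (in Python) of the asset's trigger logic: when the
# user's head position enters the odor volume, emission starts; when it
# leaves, emission stops. The names and REST calls mirror the assumed
# sketches above; the real asset is implemented inside Unity.
import math
import requests

DEVICE = "http://192.168.0.42:8080"  # assumed device address

class SphereOdorVolume:
    def __init__(self, center, radius, vaporizer_id=1):
        self.center, self.radius = center, radius
        self.vid = vaporizer_id
        self.inside = False

    def update(self, user_pos):
        d = math.dist(user_pos, self.center)
        if d <= self.radius and not self.inside:
            requests.put(f"{DEVICE}/vaporizer/{self.vid}/start")
            self.inside = True
        elif d > self.radius and self.inside:
            requests.put(f"{DEVICE}/vaporizer/{self.vid}/stop")
            self.inside = False

# Called once per frame with the tracked head position:
# volume = SphereOdorVolume(center=(0, 0, 5), radius=1.5)
# volume.update(user_pos=(0.3, 0.0, 4.6))
```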

The experiment was broken down into four steps, listed in chronological order in Table 1. Thirty-two participants were recruited randomly at a Brazilian publishing house for this study, and no compensation was offered in exchange for participation. The users did not receive any training, to avoid responses shaped by a learning effect, either in olfactory perception or in use of the equipment. Users were randomly divided into two groups and given exactly the same test conditions.

Table 1 Experiment execution steps

Participants were told it was a virtual reality experiment and were asked to read and sign the informed consent form. After that, they received the pre-test questionnaire. The goal of the experiment was stated as an "investigation of the influence of odor in virtual reality immersion". No explanation of the setup was given, nor were participants told whether they were in the odor or the control group.

Participants reached the flowers about 5 s after the application started, at which point the flower fragrance or pure water was emitted.

Table 2 shows the post-test questionnaire used to evaluate user satisfaction, which participants filled out after using the application. A Likert scale [52] was adopted for the questionnaire, with items ranging over strongly disagree, disagree, neither agree nor disagree, agree, and strongly agree.

Table 2 Post-test questionnaire

5 Analysis of the results

The pre-test questionnaire established the profile of the participants: all users (100%) reported that they had never used a virtual reality application before; 72% were male and 28% female; ages ranged from 16 to 59 years, with a median of 31; only one participant rarely used a computer, while the others used one every day; and one user did not play games, 13 played rarely, seven played once per week, and seven played every day.

Post-test questions 3, 5, 6, 7 and 9 reflected negative aspects of the application, i.e., for these items disagreement reflected a positive aspect. Figure 6 shows the color scale used, oriented toward the positive and agreeable ends of the answer scales even when the statement itself is negative; this scale is used in Tables 3 and 4. To enable weighted averages to be calculated for the statistical tests, each color was given a weight from 1 to 5 (see Fig. 6; a scoring sketch follows the figure).

Fig. 6 Color scale oriented toward the positive and agreeable ends of the answer scales even when the statement itself is negative
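For concreteness, the scoring amounts to mapping each answer to its weight and reverse-coding the negatively phrased questions so that 5 is always the best grade. The sketch below illustrates this; the reverse-coding rule is our reading of the scale described above, and the sample answers are invented:

```python
# Sketch of the Likert scoring used for the statistics: answers map to
# weights 1-5, and the negatively phrased questions (3, 5, 6, 7, 9) are
# reverse-coded so that 5 is always the best grade.
WEIGHTS = {"strongly disagree": 1, "disagree": 2,
           "neither agree nor disagree": 3, "agree": 4, "strongly agree": 5}
NEGATIVE_QUESTIONS = {3, 5, 6, 7, 9}

def score(question, answer):
    w = WEIGHTS[answer]
    return 6 - w if question in NEGATIVE_QUESTIONS else w

answers = {1: "strongly agree", 3: "disagree"}  # invented example responses
avg = sum(score(q, a) for q, a in answers.items()) / len(answers)
print(avg)  # 4.5: 'disagree' on negative question 3 scores 4
```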

After Group 1 (the flower odor group) had tested the application, they filled out the post-test questionnaire. Table 3 gives details of the questionnaire and the results obtained regarding the user experience. None of the participants left comments on the application. Overall, 84.8% of the answers were positive (strongly agreeing or agreeing with the aspect evaluated), 11.1% were neutral, and 4.1% were negative, concerning the distraction caused, the pleasantness of the odor, the discomfort of the device, and its performance.

Table 3 Answers for Group 1 (with flower fragrance)
Table 4 Answers for Group 2 (with pure water)

After Group 2 (the control group, with pure water) had tested the application, they also filled out the questionnaire. Table 4 gives details of the questionnaire and the results obtained regarding the user experience. None of the participants left comments on the application. In summary, 32.6% of the answers were neutral for the experience with water. However, even without an odor, 50% of the participants agreed or strongly agreed with the questions regarding immersion, perception of odor, and the use of this technology in applications. For 17.4% of respondents the experience fell below expectations: the aerosol took too long to release the odor at the moment they viewed the flowers, they did not detect the odor or feel immersed in the environment, or they were negatively distracted by the odor during navigation. Neither group reported any discomfort during the experience due to the device or the aerosol.

Figure 7 represents the opinions of the participants (very negative, negative, neutral, positive, or very positive) in terms of their answers to the questionnaire. We notice a greater concentration of responses at the positive and very positive ends of the scale for Group 1, which experienced the virtual environment with the odor of flowers. We conclude that using a fragrance increased user satisfaction with the application.

Fig. 7 Evaluation of user satisfaction. Histogram of responses to all questions for the two groups, compared with the normal distribution curve for the corresponding mean and standard deviation. Group 1 AVG: 4.264, STDDEV: 0.816. Group 2 (control) AVG: 3.500, STDDEV: 0.927

The comparison is more striking when we consider individual questions directly related to the experience; two-tailed t-tests show that the aroma group had a better experience.

Question 2 ("I detected a floral odor") shows that our fragrance was easily detected by users in the odor group (\(M = 4.500\), SD = 0.612) compared to the control group (\(M = 2.938\), SD = 0.747), \(p = 6.70 \times 10^{-7}\), confirming that our device was emitting the odor at an appreciable level. We also found no difference between the presence and absence of odor in the comfort of the device with respect to the aerosol (question 6), \(p = 1.000\).

Evaluation of satisfaction was also markedly different, with the aroma group finding the experience more pleasurable (\(M = 4.313\), SD = 0.845) than the control group (\(M = 2.875\), SD = 1.166), \(p = 5.52 \times 10^{-4}\). It is also worth noting that the feeling of immersion (question 1) was rated higher by the odor group (\(M = 4.500\), SD = 0.500) than by the control group (\(M = 4.063\), SD = 0.658), \(p = 0.0493 < 0.05\). Figure 8 depicts the averages for each question for the two groups.
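The comparisons above are standard independent two-sample, two-tailed t-tests; the sketch below shows the computation with SciPy on illustrative (not actual) response data:

```python
# Sketch of the two-tailed t-test used to compare the groups' Likert
# scores (here for one question); the sample data are illustrative,
# not the actual responses.
from scipy import stats

group1 = [5, 4, 5, 5, 4, 4, 5, 5, 4, 5, 4, 5, 4, 5, 5, 4]  # odor group
group2 = [3, 3, 2, 4, 3, 2, 3, 4, 3, 2, 3, 3, 4, 3, 3, 2]  # control

t, p = stats.ttest_ind(group1, group2)  # independent two-sample t-test
print(f"t = {t:.3f}, p = {p:.2e}")
```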

We calculated Cronbach's \(\alpha\) coefficients for the two groups' datasets, obtaining 0.775 and 0.783 respectively, suggesting high internal consistency.
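Cronbach's \(\alpha\) can be computed directly from a participants-by-questions score matrix; the sketch below shows the standard formula with illustrative data:

```python
# Sketch of Cronbach's alpha from a (participants x questions) score
# matrix; the data here are illustrative, not the study's responses.
import numpy as np

def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

scores = [[4, 5, 4, 4], [5, 5, 4, 5], [3, 4, 3, 4], [4, 4, 4, 5]]
print(f"alpha = {cronbach_alpha(scores):.3f}")
```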

Fig. 8 Averages for each question for the two groups, normalized so that 1 is the worst grade and 5 the best

Based on the results of our evaluation, we argue that the solution presented was a positive addition to our virtual reality application.

6 Conclusions

In this paper, we presented a non-intrusive, mobile, low-cost, wearable olfactory display that can be integrated with virtual reality glasses.

Future virtual reality applications should not only use visual and audio stimuli but should reproduce our complete sensory experience, including odor. Olfaction is an important perceptual function for human beings, able to alter heart rate, respiration rate, blood oxygen, skin resistance, and blood pressure. It is processed by the brain in relation to context, anticipation, and previous learning, enhancing the user's experience through emotions and memory, although we are rarely conscious of it. Nevertheless, it is rarely used in virtual reality applications, and insufficient effort has been made so far to include it. While there are a couple of commercial projects developing olfactory displays, none of them is currently available for purchase.

The olfactory display we developed can be fitted to the front of any virtual reality glasses, since it is small and light, placing it very close to the user's nose. It therefore provides full mobility, and the odor reaches the user's nose quickly. The solution has no appendages that could bother the user (e.g., tubes). It uses commercially available off-the-shelf components, all affordable and easily purchased, resulting in a low-cost, easy-to-reproduce system. While our first prototype was built with just one fragrance repository, it was designed for multiple odors.

New applications with odor can be developed quickly using the software service described here. The device has the potential to be used in virtual reality applications across different domains, such as tourism, marketing, education, training, and health; for example, it could assist in the diagnosis of anosmia in patients suspected of having COVID-19. When we analyzed immersion, satisfaction, and perception, we observed that users of the application with odor reported a better experience than those who used the application with pure water. We argue that this olfactory device generated a positive user experience.

As this work was intended to validate the prototype, some limitations remain to be addressed in future work, particularly the simple application with a single odor. A more complex application with more than one fragrance is needed to mimic more realistic scenarios. We are now creating a new version with more than one odor repository and building other virtual reality applications with different odors and game engines. In future work, the same study could be repeated with an unpleasant odor to compare user perception.