Article

Recall of Odorous Objects in Virtual Reality

by Jussi Rantala 1,*, Katri Salminen 2, Poika Isokoski 1, Ville Nieminen 3, Markus Karjalainen 3, Jari Väliaho 3, Philipp Müller 1, Anton Kontunen 3, Pasi Kallio 3 and Veikko Surakka 1

1 Faculty of Information Technology and Communication Sciences, Tampere University, 33100 Tampere, Finland
2 Tampere University of Applied Sciences, 33520 Tampere, Finland
3 Faculty of Medicine and Health Technology, Tampere University, 33100 Tampere, Finland
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2024, 8(6), 42; https://doi.org/10.3390/mti8060042
Submission received: 12 April 2024 / Revised: 2 May 2024 / Accepted: 14 May 2024 / Published: 21 May 2024
Figure 1. Twelve objects were used in the experiment. From left to right and top to bottom: jasmine vase, teacup, lemon, small lemon tree, banana, donut, grass, milk carton, mushroom, apple, ice cream cone, and a rose.
Figure 2. Interaction with objects. Approaching a box (a), opening it (b), and picking up the object (c).
Figure 3. A participant wearing a VR headset (a), oxygen mask (b), and motion controller (c). Presentation of synthetic odors from an odor display (d) and authentic odors from flasks (e) was synced with the participant's interaction in VR (f).
Figure 4. Flowchart of the odor production and presentation system. Black arrows indicate the digital control and data transmission. Other arrows indicate tubing that carried clean air (blue), air with authentic odors (red), and air with synthetic odors (green).
Figure 5. Mean recall accuracies and standard errors of the mean (SEMs) for all objects by odor congruency and interaction type. C = congruent odor, IC = incongruent odor, and N = no odor.
Figure 6. Mean recall accuracies and SEMs for all objects by odor congruency, odor quality, and interaction type. A = authentic odor, S = synthetic odor, C = congruent odor, and IC = incongruent odor.
Figure 7. Mean ratings and SEMs for valence by odor congruency and interaction type. C = congruent odor, IC = incongruent odor, and N = no odor.
Figure 8. Mean ratings and SEMs for arousal by odor congruency and interaction type. C = congruent odor, IC = incongruent odor, and N = no odor.

Abstract

The aim was to investigate how the congruence of odors and visual objects in virtual reality (VR) affects later memory recall of the objects. Participants (N = 30) interacted with 12 objects in VR. The interaction was varied by odor congruency (i.e., the odor matched the object’s visual appearance, the odor did not match the object’s visual appearance, or the object had no odor); odor quality (i.e., an authentic or a synthetic odor); and interaction type (i.e., participants could look and manipulate or could only look at objects). After interacting with the 12 objects, incidental memory performance was measured with a free recall task. In addition, the participants rated the pleasantness and arousal of the interaction with each object. The results showed that the participants remembered significantly more objects with congruent odors than objects with incongruent odors or odorless objects. Furthermore, interaction with congruent objects was rated significantly more pleasant and relaxed than interaction with incongruent objects. Odor quality and interaction type did not have significant effects on recall or emotional ratings. These results can be utilized in the development of multisensory VR applications.

1. Introduction

The basis of many human experiences and memories is sensory information received through vision, hearing, smell, taste, and touch. Typically, a multisensory flow of information is experienced when interacting with the physical environment. Because many interactions are moving from physical spaces to virtual environments such as virtual reality (VR), creating digital multisensory experiences that imitate reality is a popular research topic. It has even been called the holy grail of human–computer interaction (HCI) [1]. Existing VR experiences mainly focus on stimulating vision, hearing, and touch [2,3,4], while smell and taste are less explored [1,5]. Nevertheless, a substantial body of VR and HCI research exists on utilizing the sense of smell in interactions [6,7,8,9]. Researchers have used odors in VR, for instance, to study the sense of presence [10,11,12,13,14,15], rehabilitation [16], affective and behavioral responses [1], relaxation [4], and temperature illusions [17]. A further line of research is to study how odors experienced in VR affect memory.
There is significant evidence that the human olfactory system is closely linked with memories and emotions [18,19,20,21,22]. Human olfaction is based on the ability to sense volatile organic compounds (VOCs). When entering the epithelium at the nasal cavity, VOCs launch a signal that is routed to specific brain areas, such as the amygdala and hippocampus. These structures play important roles in processing human memory and emotion functions [23,24,25,26]. The neuroanatomical connections involved in the sense of smell evoke autobiographical memories [19] that are more emotional than memories elicited by other stimuli [20,23].
In the context of VR, one of the first studies exploring the connections between odors and memory was carried out by Dinh et al. [24]. Participants explored a virtual environment that contained various objects, such as a coffee maker. After visiting the environment, the participants were presented with questions that tested their memory for object locations. The results showed that the participants who received a coffee scent in the vicinity of the coffee maker remembered its location significantly better than those who did not receive the scent. The study focused on remembering locations rather than the virtual objects themselves. Further, as there was no interaction with the objects, the connection between the coffee maker and the scent was not as tightly integrated as the odor–object pairings in the current study. Another study [26] evaluated the effect of an ambient odor (i.e., an odor that does not emanate from a particular object but is present in the environment [25]) on the recall of visual items in a virtual environment. The results indicated that participants who received the ambient odor gave significantly more correct responses to a recall questionnaire than the participants who received no odor. While the study suggested that odors can affect recall, the experimental setting again used one ambient odor without any interaction with the objects. Thus, it is not known how the accurate integration of scents and objects affects recall. In the current study, enabling the manipulation of objects, such as grabbing them, provides a natural way of creating such an interaction.
There have also been studies where odors have not been found to enhance recall. Ghinea and Ademoye [27] investigated whether odors influenced the recall of information in multimedia videos. Their results showed that the recall in an experimental condition with odors was worse than in an odorless control condition. Furthermore, Sabiniewicz et al. [28] studied whether odors influence the pleasantness and memorization of visual 360° scenes. They utilized odors that were either congruent or incongruent with the visual information. The results suggested that the visual details were described as more pleasant when a congruent odor was used. However, regardless of the congruency, the odors had no effect on memorizing the visual details.
In sum, prior works suggest that adding odors to a virtual environment can positively influence memory [24,26], but this is not always the case [27,28]. Motivated by these mixed findings, the present study set out to study factors that could affect information recall in VR. First, most of the earlier studies used ambient odors that were present in the whole virtual environment [1,26,28,29], as opposed to delivering odors focused on specific objects [24]. With object-specific odors in interactive environments, it is possible to react to users’ actions [1] and create a very close spatiotemporal link between the visual environment and odors [30]. An example of this was Fragra, a visual–olfactory VR game where a player picked up a virtual food object and simultaneously perceived an odor that was either congruent or incongruent with the visual appearance of the object [31]. The player’s task in the preliminary evaluation was to judge whether the visual and olfactory cues were congruent. However, the recall of the objects was not studied. Despite a recent study [28], odor congruency is still an unexplored area in digital environments [32]. Second, the effect of odor quality on memory recall is not known. In the field of HCI, most olfactory displays utilize synthetic odors [33]. Synthetic odors are based on chemical compounds that represent only a subset of the compounds of an authentic odor. For the development of scent-enhanced VR applications, it is important to understand how accurately synthetic odors need to resemble authentic odors for a user to perceive them similarly [34]. For instance, is an odor emanating from a real lemon needed to influence the memorization of a virtual lemon? Third, the effect of interaction type has not been studied in earlier works. Participants have been able to visually inspect the environment, but interacting with or manipulating specific objects has not been possible [24,26,27,28]. 
Picking up a virtual object and inspecting it closely could reveal more visual details than only looking at it. These additional visual details, in turn, could affect later memory recall.
The current aim was to investigate how visual–olfactory congruence affects the recall of objects in VR. A system that enabled participants to see, smell, and manipulate objects within a maze-like virtual environment was developed. In the experiment, participants (N = 30) moved along a virtual corridor with 12 boxes. Each box contained a different object. Interaction with the objects was varied by odor congruency (i.e., the odor matched the object’s visual appearance, the odor did not match the object’s visual appearance, or the object had no odor); odor quality (i.e., an authentic or a synthetic odor); and interaction type (i.e., participants could look and manipulate or could only look at objects). After traversing the maze and opening all 12 boxes, incidental memory performance was assessed using an immediate free recall task, in which the participants listed as many objects as they could recall from the VR experience. After completing the memory task, the participants traveled the corridor again and were instructed to rate how they felt in response to each object using bipolar scales of valence (unpleasant–pleasant) and arousal (relaxed–aroused).

2. Methods

2.1. Participants

Thirty volunteers took part in the experiment (13 men, 17 women; mean age 24 years, age range 18–38 years). The volunteers were recruited by sending email advertisements within Tampere University to students and staff with higher education backgrounds. All were right-handed, and 16 had prior experience with VR. All participants signed a written consent form before proceeding with the experiment. The experimental protocol was approved by the Ethics Committee of the Tampere Region in Finland for nonmedical human studies. All participants reported having a normal sense of smell and no allergies or other medical conditions that would inhibit participation. The participants were compensated with a movie ticket.

2.2. Odors

Jasmine and lemon odors were selected for the experiment because earlier research indicated that participants perceived the authentic and synthetic versions of these odors as similar [34]. To generate the authentic jasmine odor, jasmine oil from pure jasmine floral extract from India was diluted to 1% with propylene glycol (CAS number 57-55-6), a nearly odorless solvent with good solubility. The diluted jasmine oil was placed in a 10 mL brown glass flask. To create the synthetic jasmine odor, the content of the undiluted jasmine oil was first analyzed with high-performance liquid chromatography–mass spectrometry (HPLC-MS). The results showed that, of 11 different major components, the ones with the highest proportion and the most jasmine-like odor were benzyl acetate (CAS number 140-11-4) and cis-jasmone (CAS number 488-10-8). These two components were purchased from Sigma-Aldrich (Merck Life Science Oy, Espoo, Finland) and diluted to 15% with propylene glycol. The dilution ratios were defined in pretests by the authors to balance the odor intensities between the authentic and synthetic odors. The synthetic lemon odor was created with undiluted limonene purchased from Sigma-Aldrich (CAS number 5989-27-5). Limonene is described as having a lemon-like odor (Sigma-Aldrich Ingredients Catalogue for Flavors and Fragrances). The authentic lemon odor was created by placing 5 mL of freshly grated lemon peel in a flask. The amount of lemon peel was adjusted in pretests to match the intensities of the authentic and synthetic lemon odors. Lemon peel was chosen over lemon essential oil to obtain an odor that was as authentic as possible.

2.3. Virtual Objects

The starting point in object selection was to include objects congruent with the lemon and jasmine odors and then select other common everyday items. The objects had to fulfill the following criteria: have an odor in the physical world, be easy to identify and name, and be able to be picked up. Using these criteria, the following 12 objects were selected for the experiment: jasmine vase, teacup, lemon, small lemon tree, banana, donut, grass, milk carton, mushroom, apple, ice cream cone, and rose. Each object had a corresponding 3D model (Figure 1). Eleven of the models were freely available for download via online 3D model marketplaces. The model of the jasmine vase was purchased from TurboSquid (https://www.turbosquid.com (accessed on 20 May 2024)).

2.4. Virtual Reality Environment

The objects were placed into a VR environment developed with Unreal Engine 4.16.2 (Epic Games, Cary, NC, USA). Participants interacted in the environment using an HTC Vive virtual reality headset (HTC Corporation, New Taipei City, Taiwan) and one HTC Vive motion controller. The environment consisted of a practice room with one box and a corridor with 12 boxes (Figure 2a). The motion controller was used for moving in the environment and opening the boxes. The location and orientation of the motion controller were visible to participants as a virtual hand in VR. Moving from one box to another was based on a teleportation metaphor that required participants to point where to move with the virtual hand and then click the trackpad of the controller with the thumb to be transported to the new position. A box was opened by touching its door with the virtual hand.
To investigate the effect of interaction type on the recall of objects, the participants were divided into two groups. Half of the participants underwent a "look" condition, where they were instructed to look at the object (Figure 2b). The other half underwent a "manipulate" condition, where they were instructed to also pick up the object (Figure 2c) by pressing the controller's trigger button with the index finger. The object remained attached to the hand for as long as the trigger was held down. The participant could rotate and visually explore the object from different angles before placing it back in its box. Before proceeding to a new box, the current box had to be closed by touching its door again with the virtual hand.
Each of the four odors was combined with one congruent and one incongruent object, so that 8 of the 12 objects were odorous. The four congruent odor–object pairs were formed by presenting the jasmine odors with the jasmine vase and the teacup and the lemon odors with the lemon and the lemon tree. The four incongruent odor–object pairs were formed by randomly selecting one of the remaining objects for each odor. The last four objects had no odor. It was expected that objects encountered at the beginning and the end of the corridor would be recalled with higher accuracy than items in the middle of the route. To minimize these primacy and recency effects in free recall [35,36], the order in which objects were placed in the boxes was randomized for each participant. As an example, Table 1 shows the odor–object pairs presented to Participant 1.
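The pairing and ordering logic described above can be sketched as follows. This is a minimal illustration rather than the authors' actual implementation; in particular, which authentic/synthetic odor variant pairs with which congruent object is an arbitrary choice made here for the sketch:

```python
import random

# Objects and odors as listed in the paper; the authentic/synthetic-to-object
# mapping below is an illustrative assumption, not taken from the paper.
ODORS = ["authentic jasmine", "synthetic jasmine", "authentic lemon", "synthetic lemon"]
CONGRUENT = {
    "authentic jasmine": "jasmine vase",
    "synthetic jasmine": "teacup",
    "authentic lemon": "lemon",
    "synthetic lemon": "small lemon tree",
}
ALL_OBJECTS = [
    "jasmine vase", "teacup", "lemon", "small lemon tree", "banana", "donut",
    "grass", "milk carton", "mushroom", "apple", "ice cream cone", "rose",
]

def assign_conditions(rng):
    """Pair each odor with one congruent and one randomly chosen incongruent
    object; the remaining four objects stay odorless. The box order is
    shuffled per participant to counter primacy/recency effects."""
    assignment = {obj: None for obj in ALL_OBJECTS}  # object -> odor (None = odorless)
    for odor, obj in CONGRUENT.items():
        assignment[obj] = odor
    free = [o for o in ALL_OBJECTS if assignment[o] is None]
    rng.shuffle(free)
    for odor, obj in zip(ODORS, free):               # one incongruent object per odor
        assignment[obj] = odor
    order = ALL_OBJECTS[:]
    rng.shuffle(order)                               # randomized box order per participant
    return order, assignment
```

Seeding the generator per participant (e.g., `assign_conditions(random.Random(participant_id))`) would make each participant's randomization reproducible.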

2.5. Odor Production and Presentation

After being seated, the participants were fitted with an oxygen mask (Figure 3b), motion controller in the dominant hand (Figure 3c), and the VR headset (Figure 3a). The oxygen mask (Intersurgical EcoLite Adult, Intersurgical Ltd., Wokingham, UK), covering the nose and mouth, was used similarly in an earlier study [34] to ensure the controlled delivery of odors.
An odor presentation system was developed to enable the controlled and temporally precise delivery of synthetic odors (Figure 3d) and authentic odors (Figure 3e) when the participant interacted with objects in VR (Figure 3f). The system is illustrated in detail in Figure 4. Room air from an air compressor (HBM AS-48, Waddinxveen, The Netherlands) was used as the carrier gas. The air was purified and dried with 5Å molecular and activated carbon sieves. The air pressure was set to 1 bar with a pressure regulator (RP1000-8G-02, CKD Corporation, Komaki, Japan) before the airflow was divided into two separate tubes for carrying the synthetic and authentic odors.
The synthetic odors were evaporated from liquid odorants using an olfactory display [34]. Each odorant was placed in a syringe and pumped with a syringe pump to a ceramic heating element for evaporation. The pumping rates and voltages of the heating elements were defined in Matlab R2015b (MathWorks, Natick, MA, USA) running on PC 1. To create synthetic jasmine, diluted benzyl acetate was pumped at a rate of 81 μL/h with the heating element at 1.2 V, and diluted cis-jasmone was likewise pumped at 81 μL/h and 1.2 V. To create synthetic lemon, limonene was pumped at a rate of 200 μL/h and 1.9 V. These parameters were defined in pretests by the authors so that the perceived intensities of jasmine and lemon would be similar. Depending on which of the two synthetic odors was to be presented next, the experimenter manually connected either the synthetic jasmine or the synthetic lemon tube to solenoid valve 1 (VX3134V-01F-6C1, SMC Corporation, Tokyo, Japan). The other synthetic odor was forwarded to an activated carbon cylinder. When participants interacted with an odorous object, PC 2, the computer responsible for running the VR environment, sent open and close commands to valve 1 through an Arduino Uno that controlled two MOSFET transistors to power the solenoid valves.
For releasing the authentic odors, the second tube coming from the pressure regulator was directed to valve 2 (VDW24WZ1DA, SMC Corporation, Japan). The valve controlled the airflow into a flask containing the authentic odor source, which was either the jasmine oil or the lemon peel. The odorous components of the jasmine oil and lemon peel evaporated in closed flasks, forming a headspace. Depending on which authentic odor was presented, the experimenter manually connected the correct flask to a cap that had an inlet for clean air and an outlet for odorized air. When valve 2 was opened, clean air was odorized by pushing it through the flask with an odorous headspace.
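The PC-side valve timing (open a valve when the participant interacts with an odorous object, close it after a fixed exposure window) can be sketched as below. The `send` callback stands in for the actual transport to the Arduino (e.g., a serial write), and the command strings are hypothetical, not the authors' protocol:

```python
import time

ODOR_EXPOSURE_S = 10.0  # exposure duration used in the experiment

class ValveController:
    """Sketch of the valve timing on the VR PC. `send` is a stand-in for the
    real link to the Arduino; command strings here are illustrative only."""

    def __init__(self, send, sleep=time.sleep):
        self.send = send
        self.sleep = sleep

    def present_odor(self, odor_quality):
        # Synthetic odors are routed through valve 1, authentic odors through valve 2.
        valve = 1 if odor_quality == "synthetic" else 2
        self.send(f"OPEN {valve}")
        self.sleep(ODOR_EXPOSURE_S)  # hold the valve open for the exposure window
        self.send(f"CLOSE {valve}")
```

Injecting `send` and `sleep` keeps the timing logic testable without hardware, e.g., `ValveController(print, sleep=lambda s: None).present_odor("authentic")` prints the open/close commands for valve 2.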
The tubes carrying synthetic and authentic odors were then combined into a single tube that was connected to the oxygen mask. The flow rate of odorous air pushed into the mask was 1.4 L/min. The rate was measured using a Gilian Gilibrator-2 NIOSH Primary Standard Air Flow Calibrator (Sensidyne, Schauenburg International GmbH, Mülheim an der Ruhr, Germany).

2.6. Procedure

The experimental tasks were presented in three blocks. In the first block, the participants were placed in the practice room, where they could get used to moving in the environment and opening a box. The box in the practice room contained an odorless grey sphere.
In the second block, the participants proceeded to the corridor with 12 boxes, each containing an object. In the "look" condition, the odor was released when the box containing an odorous object was opened. In the "manipulate" condition, the odor was released when the participants picked up the object. The duration of odor exposure was set to 10 s in both conditions. This duration was chosen because it is considered sufficient for the consistent activation of the primary olfactory cortex without causing habituation [37]. The intensity of the odor was the same in both conditions. The participants were free to continue exploring the object after the odor exposure. The duration of the interaction was defined as the time interval from the opening to the closing of the box. After the 12 boxes had been opened, the experimental gear was removed, and the participants were presented with an immediate free recall task. The participants were instructed to write on paper the names of as many of the 12 objects as possible. Following the procedure of Bradley et al. [38], a correct recall was scored if an object name could be clearly linked to one of the 12 objects. For example, a correct recall would be scored both for "ice cream" and "ice cream cone". Partial recalls were not credited.
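The scoring rule, crediting a response only when it can be linked unambiguously to one of the 12 objects (including shortened names such as "ice cream"), can be sketched with a simple matcher. This is an illustrative automation of the rule, not the authors' actual scoring procedure:

```python
OBJECTS = [
    "jasmine vase", "teacup", "lemon", "small lemon tree", "banana", "donut",
    "grass", "milk carton", "mushroom", "apple", "ice cream cone", "rose",
]

def score_recall(responses, objects=OBJECTS):
    """Count correct recalls: a response counts if it names exactly one
    object, either exactly or as an unambiguous partial name (e.g.,
    'ice cream' for 'ice cream cone'). Illustrative sketch only."""
    correct = 0
    matched = set()
    for resp in responses:
        r = resp.strip().lower()
        if not r:
            continue
        exact = [o for o in objects if r == o.lower()]
        hits = exact or [o for o in objects if r in o.lower()]
        if len(hits) == 1 and hits[0] not in matched:  # unambiguous, not yet credited
            matched.add(hits[0])
            correct += 1
    return correct
```

Preferring exact matches keeps "lemon" from being treated as ambiguous between "lemon" and "small lemon tree", while still crediting unique partial names.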
In the third block, the participants traveled the corridor once more. The order of the objects along the corridor was identical to the second block. This time, the participants were asked to rate how pleasant and aroused [39] they felt after interacting with each object. The ratings were given separately for each object after its box had been closed. The ratings used nine-point scales that varied from −4 to +4. For valence, the extremes of the scale were labeled unpleasant (−4) and pleasant (+4). The center point (0) represented a neutral experience that was neither unpleasant nor pleasant. The extremes of the arousal scale were relaxed (−4) and aroused (+4), and the center point (0) represented a neutral experience that was neither relaxing nor arousing. The experimenter read out the rating scales so that the participants could answer verbally without having to repeatedly remove the VR headset.
Conducting the whole experiment took approximately 40 min. The odor production system was flushed with clean air for 30 min after each participant to purge any residual odors.

2.7. Data Analysis

Because the recall accuracy and rating data were not normally distributed, ARTool [40] was used for aligned rank transformations before performing additional analyses with IBM SPSS Statistics (version 25).
Two separate analyses were performed for the recall accuracies. First, it was analyzed whether odor congruency or interaction type had an effect on the number of correctly recalled objects. A two-way, 3 × 2 (odor congruency × interaction type) mixed analysis of variance (ANOVA) with odor congruency as a within-subjects factor and interaction type as a between-subjects factor was used. Bonferroni-corrected pairwise t-tests were used for post hoc comparisons when needed. Then, it was analyzed whether odor quality had an effect on how many of the eight odorous objects the participants recalled. For this purpose, a three-way, 2 × 2 × 2 (odor quality × odor congruency × interaction type) mixed ANOVA with odor quality and odor congruency as within-subjects factors and interaction type as a between-subjects factor was conducted.
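Before these ANOVAs, the raw recalls must be aggregated into per-participant accuracy percentages for each congruency level, the within-subjects measure the analysis operates on. A minimal sketch, assuming a hypothetical flat trial layout rather than the authors' data format:

```python
from collections import defaultdict

def recall_accuracy(trials):
    """trials: (participant_id, congruency, recalled) tuples, where
    congruency is 'C', 'IC', or 'N' and recalled is a bool. Returns the
    recall accuracy (%) per (participant, congruency) cell, i.e., the
    dependent measure entering the 3 x 2 mixed ANOVA."""
    counts = defaultdict(lambda: [0, 0])  # cell -> [n recalled, n total]
    for pid, congruency, recalled in trials:
        cell = counts[(pid, congruency)]
        cell[0] += int(recalled)
        cell[1] += 1
    return {cell: 100.0 * hit / n for cell, (hit, n) in counts.items()}
```

The resulting per-cell percentages would then be aligned-rank transformed (e.g., with ARTool) before the factorial ANOVA, since the raw accuracies are not normally distributed.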
For the ratings of valence and arousal, two separate two-way, 3 × 2 (odor congruency × interaction type) mixed ANOVAs with odor congruency as a within-subjects factor and interaction type as a between-subjects factor were conducted.
A two-way, 3 × 2 (odor congruency × interaction type) mixed ANOVA with odor congruency as a within-subjects factor and interaction type as a between-subjects factor was run to analyze whether odor congruency or interaction type had an effect on the duration of the interaction.

3. Results

3.1. Recall Accuracy

For the recall accuracy of all 12 objects (Figure 5), a two-way mixed ANOVA showed a statistically significant main effect of odor congruency (F[2, 56] = 9.5, p < 0.01, η2 = 0.254). Post hoc pairwise comparisons showed that the participants recalled significantly more objects with congruent odors than with incongruent odors (MD = 20.1%, p = 0.01) or with no odor (MD = 21.7%, p < 0.01). The main effect of interaction type and the interaction of the main effects were not statistically significant.
Another analysis was performed to assess the possible effect of odor quality on the recall accuracy of the eight odorous objects (Figure 6). A three-way mixed ANOVA showed a statistically significant main effect of odor congruency (F[1, 28] = 6.3, p = 0.018, η2 = 0.183). A post hoc pairwise comparison showed that the participants recalled significantly more objects with congruent odors than with incongruent odors (MD = 20.1%, p = 0.014). The main effects of odor quality and interaction type and the interactions of the main effects were not statistically significant.

3.2. Ratings of Valence and Arousal

For the ratings of valence (Figure 7), a two-way mixed ANOVA showed a statistically significant main effect of odor congruency (F[2, 56] = 10.3, p < 0.01, η2 = 0.269). Post hoc pairwise comparisons showed that the participants felt significantly more pleasant in response to congruent objects than incongruent objects (MD = 1.3, p < 0.01). The main effect of interaction type and the interaction of the main effects were not statistically significant.
For the ratings of arousal (Figure 8), a two-way mixed ANOVA showed a statistically significant main effect of odor congruency (F[2, 56] = 9.5, p < 0.01, η2 = 0.254). Post hoc pairwise comparisons showed that the participants experienced significantly higher arousal in response to incongruent objects than congruent objects (MD = 0.8, p < 0.01) or odorless objects (MD = 1.2, p = 0.01). The main effect of interaction type and the interaction of the main effects were not statistically significant.

3.3. Interaction Duration

For the duration of the interaction, a two-way mixed ANOVA did not show statistically significant main effects for odor congruency, interaction type, or interaction of the main effects.

4. Discussion

The results showed that the visual–olfactory congruence of objects in VR facilitated their immediate incidental recall. The participants remembered significantly more objects with congruent odors (83%) than objects with incongruent odors (63%) or objects with no odors (62%). To the best of our knowledge, this is the first study showing such a stimulus-specific visual–olfactory memory effect. An earlier work demonstrated a higher recall of objects when an ambient odor was present in the virtual environment [26]. Additional analyses of the current results showed that the effect was independent of the odor quality and interaction type.
The first possible explanation for the positive memory effect favoring congruent visual–olfactory information involves the observed changes in the ratings of valence and arousal. Interactions with congruent objects were significantly more pleasant than interactions with incongruent objects. In addition, interactions with incongruent objects were significantly more arousing than interactions with congruent and odorless objects. These results seem to suggest that experienced pleasantness could be associated with the better recall of visual–olfactory information. Contrary to this, earlier research on emotions and memory has shown that arousal is the key factor in engaging processes that modulate the storage of explicit memories [41,42,43,44]. Following this line of evidence, if increased arousal would explain the positive memory effect, incongruent objects would have been recalled more often in the study at hand. Regardless of the odor congruency, the mean ratings of valence and arousal were close to the neutral range, suggesting that the intensity of the emotional responses was mild. Thus, it seems unlikely that the increased pleasantness observed after congruent objects alone could explain the positive memory effect.
The second possible explanation for the memory effect is the multisensory processing of visual–olfactory information in working memory. Several studies using visual, auditory, and visual–auditory stimuli have shown that recall is better for cross-modal stimuli than for modality-specific stimuli [45,46,47]. This difference has been explained using the dual coding theory, which postulates that a memory trace for a cross-modal stimulus is a combination of the independent modality-specific sensory traces [48]. The existence of these two traces results in an advantage in memory recall. Although earlier studies mostly used visual, auditory, and visual–auditory stimuli, the performance benefit for cross-modal information processing also extends to odors and pictures [49]. It is vital to note that a cross-modal presentation per se is not sufficient to provide memory benefits [45]. The recall of incongruent visual–olfactory objects in the current study was worse than the recall of congruent visual–olfactory objects, potentially because it was more difficult for the participants to name—and consequently, more difficult to memorize—semantically incongruent information. The naming of perceived multimodal information is important, because evidence suggests that, regardless of the presentation modality, the content of the representation could be encoded in a verbal code when memorizing it [45]. Therefore, the format in which participants experience information is not necessarily the format in which information is encoded [50]. Semantically congruent visual–olfactory information could make it easier, for instance, to name a congruent lemon object and odor as “lemon”. Recognizing and naming the object requires that the multimodal working memory representation is integrated with semantic information from the long-term memory [50]. 
Taken together, the finding of a superior memory performance for congruent visual–olfactory objects is in line with the current theories of how humans process multisensory information in working memory.
The third possible reason that could explain the higher recall rates of congruent objects is the effect of odor congruency on attention. Earlier findings indicate that the amount and depth of stimulus processing are correlated with the strength of memory traces and subsequent ease and accuracy of retrieval [51,52]. The role of odor on attention was investigated in a study in which participants evaluated unfamiliar and familiar brand names with either a pleasant or unpleasant ambient odor [53]. The results showed that the presence of a pleasant ambient odor increased the amount of attention paid to unfamiliar brand names. As a result, participants exhibited a higher recall of unfamiliar brands. In the current study, the box opening and closing events were recorded to analyze whether odor congruency influenced the duration of interaction with the objects. No significant effect on odor congruency was found. Although this result seems to suggest that attention did not play a role in enhancing the recall of congruent objects, the mere duration from box opening to box closing may not be the best way to assess attention. An alternative would be to measure the focus of participants’ visual attention by measuring their eye movement behavior. An earlier study in a consumer context explored the effects of olfactory and visual cues on gaze patterns by showing the participants print advertisements that contained several objects, such as a lemon [54]. When the participants smelled a lemon odor, their eye gaze diverted to the visual image of a lemon in the ad. This diversion of visual attention also tended to have a positive effect on the recall of information. Furthermore, earlier research showed that an odor-related visual cue was explored faster and for a shorter time in the presence of the congruent odor [55]. 
Given these findings, it seems plausible that the participants’ gaze patterns in the current study may have differed in response to the presence and congruency of the odor. However, gaze patterns were not recorded. Studying this would be a possible direction for future research.
The recall effect was independent of the odor quality. This implies that the synthetic odors presented with the olfactory display were perceived to be sufficiently congruent with the visual appearance of the objects. This is interesting in light of earlier studies where enhanced recall was reported only with authentic odors [24,26]. The present findings argue for the use of synthetic odors in scent-enhanced VR applications. This is encouraging, given that many olfactory displays in HCI utilize mixtures of chemicals that do not fully recreate the complexity of the authentic odors. Furthermore, the interaction type did not have a significant effect on recall. This result is somewhat surprising, because participants in the “manipulate” condition typically brought the objects closer, rotated them, and, in theory, received more information on the object’s shape and texture than participants in the “look” condition. One possible explanation is that the objects chosen for the current experiment were easy to identify and name only by looking at them inside the boxes. Thus, the manipulation did not give crucial additional cues for memorization. For the design of virtual environments with several odorous objects, the findings suggest that physical manipulation does not automatically bring additional memory benefits compared to only visual inspection of objects.
One limitation of the current work was the use of only 12 visual and four olfactory stimuli. Although the number of olfactory stimuli was greater than in related studies [24,26], the results must be interpreted with caution until future studies assess the generalizability of the findings to other types of visual and olfactory stimuli. In addition, because the ratings of valence and arousal were close to the neutral range, it would be informative in future studies to include congruent visual–olfactory stimuli that elicit stronger emotional responses. This would help to study how odor congruency and the strength of the evoked emotions independently contribute to recall. Moreover, the odor presentation system used had certain limitations. The tubes carrying the lemon and jasmine odors were combined into a single tube that delivered both odors to the participant. It is possible that cross-contamination of odors took place due to lingering artifacts when the participant picked up objects with different odors. However, significant cross-contamination between the lemon and jasmine odors would likely have resulted in smaller differences in the recall rates between congruent and incongruent objects.
This study has implications for the design of scent-enhanced VR applications. It was found that memory recall can be facilitated selectively for individual odorous objects, provided that the odors are semantically congruent with the visual appearance of the objects. This visual–olfactory congruency was also found to increase the perceived pleasantness. These findings corroborate recent work showing that congruency between a pleasant odor and visual information in VR generates a better multisensory digital experience [1]. Thus, it can be argued that congruent visual–olfactory stimulation is a crucial factor when designing multisensory VR experiences. In addition to improving the overall digital experience, many possibilities remain to study the connections between odors, memory, and emotions in different application areas. For example, the present work is relevant to the field of sensory marketing. Earlier research using physical products has shown that product-specific odors enhance memories for product information more effectively than ambient odors [56]. Given the current results with virtual objects in VR, it seems plausible that olfactory displays capable of presenting product-specific odors could influence consumers’ ability to remember product information. In general, the addition of odors as a seamless part of multisensory VR can make the interaction more pleasant and memorable. Furthermore, the current results can be of interest to researchers studying how olfaction contributes to learning and training in VR [29,57,58,59]. Further work is needed to understand whether the visual–olfactory congruency of objects can increase the memorability of the environment in educational applications.

5. Conclusions

In summary, the work reported in this paper suggests that the memory recall of an object is improved if the object is presented in VR with an odor that matches the object’s visual appearance. Therefore, if the designer of a multisensory VR experience chooses to use several odors instead of a single ambient odor, it is important to ensure the semantic congruence of all the selected visual–olfactory stimuli. The current findings also indicated that synthetic odors consisting of only a few chemical compounds can provide sufficient congruence with visual information. Finally, the visual–olfactory memory effect was independent of the interaction type. Merely looking at an object in VR while smelling a semantically congruent odor can be sufficient to improve memory recall. These findings can pave the way for the greater utilization of olfactory technology in creating multisensory VR experiences. In addition, the reported work supports further investigation of how odors affect human memory processes and how they could be utilized, for example, in developing applications that could help in memory rehabilitation.

Author Contributions

Conceptualization, J.R., K.S., P.I. and V.S.; methodology, J.R., K.S., P.I. and V.S.; software, V.N., M.K., J.V., A.K., J.R., P.I. and P.M.; validation, J.R., K.S. and V.S.; formal analysis, J.R., K.S. and V.S.; investigation, J.R. and K.S.; resources, V.N., M.K., J.V., A.K., J.R. and K.S.; data curation, J.R.; writing—original draft preparation, J.R., K.S., P.I. and V.S.; writing—review and editing, J.R., K.S., P.I., V.N., M.K., J.V., P.M., A.K., P.K. and V.S.; visualization, J.R. and K.S.; supervision, V.S. and P.K.; project administration, V.S. and P.K.; funding acquisition, V.S. and P.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Research Council of Finland, grant numbers 295432, 295433, 295434, 323498, 323529 and 323530.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of Tampere University, Finland.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

All of the data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Flavián, C.; Ibáñez-Sánchez, S.; Orús, C. The Influence of Scent on Virtual Reality Experiences: The Role of Aroma-Content Congruence. J. Bus. Res. 2021, 123, 289–301. [Google Scholar] [CrossRef]
  2. Guttentag, D.A. Virtual Reality: Applications and Implications for Tourism. Tour. Manag. 2010, 31, 637–651. [Google Scholar] [CrossRef]
  3. LaViola, J.J., Jr.; Kruijff, E.; Bowman, D.; Poupyrev, I.P.; McMahan, R.P. 3D User Interfaces: Theory and Practice, 2nd ed.; Addison-Wesley: Hoboken, NJ, USA, 2017. [Google Scholar]
  4. Serrano, B.; Banos, R.M.; Botella, C. Virtual Reality and Stimulation of Touch and Smell for Inducing Relaxation: A Randomized Controlled Trial. Comput. Hum. Behav. 2016, 55, 1–8. [Google Scholar] [CrossRef]
  5. Narumi, T.; Nishizaka, S.; Kajinami, T.; Tanikawa, T.; Hirose, M. Augmented Reality Flavors: Gustatory Display Based on Edible Marker and Cross-Modal Interaction. In Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems—CHI ’11, Vancouver, BC, Canada, 7–12 May 2011; ACM Press: New York, NY, USA, 2011; p. 93. [Google Scholar]
  6. Dmitrenko, D.; Maggioni, E.; Obrist, M. OSpace: Towards a Systematic Exploration of Olfactory Interaction Spaces. In Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces, Brighton, UK, 17–20 October 2017; ACM: New York, NY, USA, 2017; pp. 171–180. [Google Scholar]
  7. Holloman, A.K.; Crawford, C.S. Defining Scents: A Systematic Literature Review of Olfactory-Based Computing Systems. ACM Trans. Multimed. Comput. Commun. Appl. 2022, 18, 1–22. [Google Scholar] [CrossRef]
  8. Maggioni, E.; Cobden, R.; Dmitrenko, D.; Hornbæk, K.; Obrist, M. SMELL SPACE: Mapping out the Olfactory Design Space for Novel Interactions. ACM Trans. Comput. Hum. Interact. 2020, 27, 1–26. [Google Scholar] [CrossRef]
  9. Wang, Y.; Amores, J.; Maes, P. On-Face Olfactory Interfaces. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; ACM: New York, NY, USA, 2020; pp. 1–9. [Google Scholar]
  10. Baus, O.; Bouchard, S.; Nolet, K. Exposure to a Pleasant Odour May Increase the Sense of Reality, but Not the Sense of Presence or Realism. Behav. Inf. Technol. 2019, 38, 1369–1378. [Google Scholar] [CrossRef]
  11. Baus, O.; Bouchard, S. Exposure to an Unpleasant Odour Increases the Sense of Presence in Virtual Reality. Virtual Real. 2017, 21, 59–74. [Google Scholar] [CrossRef]
  12. Carulli, M.; Bordegoni, M.; Cugini, U. Visual-Olfactory Immersive Environment for Product Evaluation. In Proceedings of the 2015 IEEE Virtual Reality (VR), Arles, France, 23–27 March 2015; Volume 2, pp. 161–162. [Google Scholar]
  13. Hopf, J.; Scholl, M.; Neuhofer, B.; Egger, R. Exploring the Impact of Multisensory VR on Travel Recommendation: A Presence Perspective. In Information and Communication Technologies in Tourism 2020; Springer International Publishing: Cham, Switzerland, 2020; pp. 169–180. [Google Scholar]
  14. Jung, S.; Wood, A.L.; Hoermann, S.; Abhayawardhana, P.L.; Lindeman, R.W. The Impact of Multi-Sensory Stimuli on Confidence Levels for Perceptual-Cognitive Tasks in VR. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Atlanta, GA, USA, 22–26 March 2020; pp. 463–472. [Google Scholar] [CrossRef]
  15. Ranasinghe, N.; Eason Wai Tung, C.; Yen, C.C.; Do, E.Y.-L.; Jain, P.; Thi Ngoc Tram, N.; Koh, K.C.R.; Tolley, D.; Karwita, S.; Lien-Ya, L.; et al. Season Traveller: Multisensory Narration for Enhancing the Virtual Reality Experience. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems—CHI ’18, Montreal, QC, Canada, 21–26 April 2018; ACM Press: New York, NY, USA, 2018; pp. 1–13. [Google Scholar]
  16. Covarrubias, M.; Bordegoni, M.; Rosini, M.; Guanziroli, E.; Cugini, U.; Molteni, F. VR System for Rehabilitation Based on Hand Gestural and Olfactory Interaction. In Proceedings of the 21st ACM Symposium on Virtual Reality Software and Technology, Beijing, China, 13–15 November 2015; pp. 117–120. [Google Scholar] [CrossRef]
  17. Brooks, J.; Nagels, S.; Lopes, P. Trigeminal-Based Temperature Illusions. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; ACM: New York, NY, USA, 2020; pp. 1–12. [Google Scholar]
  18. Herz, R.S. Emotion Experienced during Encoding Enhances Odor Retrieval Cue Effectiveness. Am. J. Psychol. 1997, 110, 489–505. [Google Scholar] [CrossRef] [PubMed]
  19. Herz, R.S.; Eliassen, J.; Beland, S.; Souza, T. Neuroimaging Evidence for the Emotional Potency of Odor-Evoked Memory. Neuropsychologia 2004, 42, 371–378. [Google Scholar] [CrossRef] [PubMed]
  20. Herz, R.S.; Schooler, J.W. A Naturalistic Study of Autobiographical Memories Evoked by Olfactory and Visual Cues: Testing the Proustian Hypothesis. Am. J. Psychol. 2002, 115, 21. [Google Scholar] [CrossRef]
  21. Kadohisa, M. Effects of Odor on Emotion, with Implications. Front. Syst. Neurosci. 2013, 7, 1–6. [Google Scholar] [CrossRef] [PubMed]
  22. Zald, D.H.; Pardo, J.V. Emotion, Olfaction, and the Human Amygdala: Amygdala Activation during Aversive Olfactory Stimulation. Proc. Natl. Acad. Sci. USA 1997, 94, 4119–4124. [Google Scholar] [CrossRef] [PubMed]
  23. Herz, R.S. Are Odors the Best Cues to Memory? A Cross-Modal Comparison of Associative Memory Stimuli. Ann. N. Y. Acad. Sci. 1998, 855, 670–674. [Google Scholar] [CrossRef] [PubMed]
  24. Dinh, H.Q.; Walker, N.; Song, C.; Kobayashi, A.; Hodges, L.F. Evaluating the Importance of Multi-Sensory Input on Memory and the Sense of Presence in Virtual Environments. In Proceedings of the IEEE Virtual Reality (Cat. No. 99CB36316), Houston, TX, USA, 13–17 March 1999; pp. 222–228. [Google Scholar]
  25. Spangenberg, E.R.; Crowley, A.E.; Henderson, P.W. Improving the Store Environment: Do Olfactory Cues Affect Evaluations and Behaviors? J. Mark. 1996, 60, 67–80. [Google Scholar] [CrossRef]
  26. Tortell, R.; Luigi, D.-P.; Dozois, A.; Bouchard, S.; Morie, J.F.; Ilan, D. The Effects of Scent and Game Play Experience on Memory of a Virtual Environment. Virtual Real. 2007, 11, 61–68. [Google Scholar] [CrossRef]
  27. Ghinea, G.; Ademoye, O.A. Olfaction-Enhanced Multimedia: Bad for Information Recall? In Proceedings of the 2009 IEEE International Conference on Multimedia and Expo, New York, NY, USA, 28 June–3 July 2009; pp. 970–973. [Google Scholar]
  28. Sabiniewicz, A.; Schaefer, E.; Guducu, C.; Manesse, C.; Bensafi, M.; Krasteva, N.; Nelles, G.; Hummel, T. Smells Influence Perceived Pleasantness but Not Memorization of a Visual Virtual Environment. i-Perception 2021, 12, 204166952198973. [Google Scholar] [CrossRef] [PubMed]
  29. Moore, A.G.; Herrera, N.S.; Hurst, T.C.; McMahan, R.P.; Poeschl, S. The Effects of Olfaction on Training Transfer for an Assembly Task. In Proceedings of the 2015 IEEE Virtual Reality (VR), Arles, France, 23–27 March 2015; pp. 237–238. [Google Scholar]
  30. Toet, A.; Van Schaik, M.; Theunissen, N.C.M. No Effect of Ambient Odor on the Affective Appraisal of a Desktop Virtual Environment with Signs of Disorder. PLoS ONE 2013, 8, e78721. [Google Scholar] [CrossRef] [PubMed]
  31. Mochizuki, A.; Amada, T.; Sawa, S.; Takeda, T.; Motoyashiki, S.; Kohyama, K.; Imura, M.; Chihara, K. Fragra: A Visual-Olfactory VR Game. In Proceedings of the ACM SIGGRAPH 2004 Sketches on—SIGGRAPH ’04, Los Angeles, CA, USA, 8–12 August 2004; ACM Press: New York, NY, USA, 2004; Volume 2004, p. 123. [Google Scholar]
  32. Errajaa, K.; Legohérel, P.; Daucé, B. Immersion and Emotional Reactions to the Ambiance of a Multiservice Space: The Role of Perceived Congruence between Odor and Brand Image. J. Retail. Consum. Serv. 2018, 40, 100–108. [Google Scholar] [CrossRef]
  33. Yanagida, Y.; Tomono, A. Basics for Olfactory Display. In Human Olfactory Displays and Interfaces; IGI Global: Hershey, PA, USA, 2013; pp. 60–85. [Google Scholar]
  34. Salminen, K.; Leivo, J.; Telembeci, A.A.; Lekkala, J.; Kallio, P.; Surakka, V.; Rantala, J.; Isokoski, P.; Lehtonen, M.; Müller, P.; et al. Olfactory Display Prototype for Presenting and Sensing Authentic and Synthetic Odors. In Proceedings of the 2018 on International Conference on Multimodal Interaction—ICMI ’18, Boulder, CO, USA, 16–20 October 2018; ACM Press: New York, NY, USA, 2018; pp. 73–77. [Google Scholar]
  35. Davelaar, E.J.; Goshen-Gottstein, Y.; Ashkenazi, A.; Haarmann, H.J.; Usher, M. The Demise of Short-Term Memory Revisited: Empirical and Computational Investigations of Recency Effects. Psychol. Rev. 2005, 112, 3–42. [Google Scholar] [CrossRef] [PubMed]
  36. Glanzer, M.; Cunitz, A.R. Two Storage Mechanisms in Free Recall. J. Verbal Learn. Verbal Behav. 1966, 5, 351–360. [Google Scholar] [CrossRef]
  37. Poellinger, A.; Thomas, R.; Lio, P.; Lee, A.; Makris, N.; Rosen, B.R.; Kwong, K.K. Activation and Habituation in Olfaction—An fMRI Study. NeuroImage 2001, 13, 547–560. [Google Scholar] [CrossRef] [PubMed]
  38. Bradley, M.M.; Lang, P.J. Measuring Emotion: The Self-Assessment Manikin and the Semantic Differential. J. Behav. Ther. Exp. Psychiatry 1994, 25, 49–59. [Google Scholar] [CrossRef] [PubMed]
  39. Bradley, M.M.; Greenwald, M.K.; Petry, M.C.; Lang, P.J. Remembering Pictures: Pleasure and Arousal in Memory. J. Exp. Psychol. Learn. Mem. Cogn. 1992, 18, 379–390. [Google Scholar] [CrossRef] [PubMed]
  40. Wobbrock, J.O.; Findlater, L.; Gergle, D.; Higgins, J.J. The Aligned Rank Transform for Nonparametric Factorial Analyses Using Only Anova Procedures. In Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems—CHI ’11, Vancouver, BC, Canada, 7–12 May 2011; ACM Press: New York, NY, USA, 2011; Volume 172, p. 143. [Google Scholar]
  41. Cahill, L.; McGaugh, J.L. Mechanisms of Emotional Arousal and Lasting Declarative Memory. Trends Neurosci. 1998, 21, 294–299. [Google Scholar] [CrossRef] [PubMed]
  42. LaBar, K.S.; Cabeza, R. Cognitive Neuroscience of Emotional Memory. Nat. Rev. Neurosci. 2006, 7, 54–64. [Google Scholar] [CrossRef] [PubMed]
  43. McGaugh, J.L.; Cahill, L.; Roozendaal, B. Involvement of the Amygdala in Memory Storage: Interaction with Other Brain Systems. Proc. Natl. Acad. Sci. USA 1996, 93, 13508–13514. [Google Scholar] [CrossRef] [PubMed]
  44. Phelps, E.A.; LeDoux, J.E. Contributions of the Amygdala to Emotion Processing: From Animal Models to Human Behavior. Neuron 2005, 48, 175–187. [Google Scholar] [CrossRef] [PubMed]
  45. Delogu, F.; Raffone, A.; Belardinelli, M.O. Semantic Encoding in Working Memory: Is There a (Multi)Modality Effect? Memory 2009, 17, 655–663. [Google Scholar] [CrossRef] [PubMed]
  46. Goolkasian, P.; Foos, P.W. Bimodal Format Effects in Working Memory. Am. J. Psychol. 2005, 118, 61–78. [Google Scholar] [CrossRef]
  47. Thompson, V.A.; Paivio, A. Memory for Pictures and Sounds: Independence of Auditory and Visual Codes. Can. J. Exp. Psychol. 1994, 48, 380–398. [Google Scholar] [CrossRef]
  48. Paivio, A. Mental Representations: A Dual Coding Approach; Oxford University Press: New York, NY, USA, 1986. [Google Scholar]
  49. Lyman, B.J.; McDaniel, M.A. Memory for Odors and Odor Names: Modalities of Elaboration and Imagery. J. Exp. Psychol. Learn. Mem. Cogn. 1990, 16, 656–664. [Google Scholar] [CrossRef]
  50. Quak, M.; London, R.E.; Talsma, D. A Multisensory Perspective of Working Memory. Front. Hum. Neurosci. 2015, 9, 1–11. [Google Scholar] [CrossRef] [PubMed]
  51. Baddeley, A. Human Memory: Theory and Practice; Psychology Press: London, UK, 1997. [Google Scholar]
  52. Craik, F.I.M.; Tulving, E. Depth of Processing and the Retention of Words in Episodic Memory. J. Exp. Psychol. Gen. 1975, 104, 268–294. [Google Scholar] [CrossRef]
  53. Morrin, M.; Ratneshwar, S. The Impact of Ambient Scent on Evaluation, Attention, and Memory for Familiar and Unfamiliar Brands. J. Bus. Res. 2000, 49, 157–165. [Google Scholar] [CrossRef]
  54. Lwin, M.O.; Morrin, M.; Chong, C.S.T.; Goh, S.X. Odor Semantics and Visual Cues: What We Smell Impacts Where We Look, What We Remember, and What We Want to Buy. J. Behav. Decis. Mak. 2016, 29, 336–350. [Google Scholar] [CrossRef]
  55. Seigneuric, A.; Durand, K.; Jiang, T.; Baudouin, J.-Y.; Schaal, B. The Nose Tells It to the Eyes: Crossmodal Associations between Olfaction and Vision. Perception 2010, 39, 1541–1554. [Google Scholar] [CrossRef] [PubMed]
  56. Krishna, A.; Lwin, M.O.; Morrin, M. Product Scent and Memory. J. Consum. Res. 2010, 37, 57–67. [Google Scholar] [CrossRef]
  57. Covaci, A.; Ghinea, G.; Lin, C.-H.; Huang, S.-H.; Shih, J.-L. Multisensory Games-Based Learning-Lessons Learnt from Olfactory Enhancement of a Digital Board Game. Multimed. Tools Appl. 2018, 77, 21245–21263. [Google Scholar] [CrossRef]
  58. Kwok, R.C.-W.; Cheng, S.H.; Ip, H.H.-S.; Kong, J.S.-L. Design of Affectively Evocative Smart Ambient Media for Learning. In Proceedings of the 2009 Workshop on Ambient Media Computing—AMC ’09, Beijing, China, 23 October 2009; ACM Press: New York, NY, USA, 2009; p. 65. [Google Scholar]
  59. Tijou, A.; Richard, E.; Richard, P. Using Olfactive Virtual Environments for Learning Organic Molecules. In Proceedings of the International Conference on Technologies for e-Learning and Digital Entertainment, Hangzhou, China, 16–19 April 2006; Volume 3942, pp. 1223–1233. [Google Scholar] [CrossRef]
Figure 1. Twelve objects were used in the experiment. From left to right and top to bottom: jasmine vase, teacup, lemon, small lemon tree, banana, donut, grass, milk carton, mushroom, apple, ice cream cone, and a rose.
Figure 2. Interaction with objects. Approaching a box (a), opening it (b), and picking up the object (c).
Figure 3. A participant wearing a VR headset (a), oxygen mask (b), and motion controller (c). Presentation of synthetic odors from an odor display (d) and authentic odors from flasks (e) was synced with participant’s interaction in VR (f).
Figure 4. Flowchart of the odor production and presentation system. Black arrows indicate the digital control and data transmission. Other arrows indicate tubing that carried clean air (blue), air with authentic odors (red), and air with synthetic odors (green).
Figure 5. Mean recall accuracies and standard errors of the mean (SEMs) for all objects by odor congruency and interaction type. C = congruent odor, IC = incongruent odor, and N = no odor.
Figure 6. Mean recall accuracies and SEMs for all objects by odor congruency, odor quality, and interaction type. A = authentic odor, S = synthetic odor, C = congruent odor, and IC = incongruent odor.
Figure 7. Mean ratings and SEMs for valence by odor congruency and interaction type. C = congruent odor, IC = incongruent odor, and N = no odor.
Figure 8. Mean ratings and SEMs for arousal by odor congruency and interaction type. C = congruent odor, IC = incongruent odor, and N = no odor.
Table 1. Object–odor pairs presented to Participant 1.

Box  Object          Odor               Odor–Object Congruence
1    Rose            None               –
2    Jasmine vase    Authentic jasmine  Congruent
3    Banana          None               –
4    Teacup          Synthetic jasmine  Congruent
5    Donut           Authentic lemon    Incongruent
6    Mushroom        Synthetic lemon    Incongruent
7    Apple           Authentic jasmine  Incongruent
8    Lemon tree      Synthetic lemon    Congruent
9    Lemon           Authentic lemon    Congruent
10   Grass           Synthetic jasmine  Incongruent
11   Milk carton     None               –
12   Ice cream cone  None               –
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Rantala, J.; Salminen, K.; Isokoski, P.; Nieminen, V.; Karjalainen, M.; Väliaho, J.; Müller, P.; Kontunen, A.; Kallio, P.; Surakka, V. Recall of Odorous Objects in Virtual Reality. Multimodal Technol. Interact. 2024, 8, 42. https://doi.org/10.3390/mti8060042
