Rediscovering the forgotten sense

Smell begins with scent receptors in the nose, which detect odour molecules and pass signals to the olfactory bulb. From there, the information feeds directly into deep, central brain structures that govern emotion and memory.

Because smell is embedded within the most basic functions of the brain, it can evoke strong emotions, activate memories and trigger cognitive processes.

There is growing scientific interest in smell and in how it can be applied: smell affects far more of our experience and behaviour than we previously realised.

Smell and our emotions

Our sense of smell is one of the most powerful human senses and an integral part of how we experience reality in everyday life. Smells are all around us, and even when we are not aware of them, they can influence our emotions, intentions and behaviours, and even trigger our memories.

Smell enhances our wellbeing

Smell plays a large role in our physical and emotional wellbeing. Scent combinations and sequences can help alleviate stress or stimulate the senses, and can even support the management of medical conditions such as depression and neurodegenerative disease.

Smell health transforms our future

Smell health is central to our physical and emotional wellbeing, yet many illnesses, including depression and neurodegenerative disease, can diminish the sense of smell. Smell training has been shown to help people recover from smell loss and improve overall wellbeing and independence.

OW’s solutions supported by deep science

You can read more about the science behind OW’s digital smell technology solutions and applications in our scientific publications.

SmellControl: The Study of Sense of Agency in Smell.

Patricia Cornelio, Emanuela Maggioni, Giada Brianza, Sriram Subramanian and Marianna Obrist. ACM International Conference on Multimodal Interaction (ICMI 2020). [DOI] [Video] – Best Paper Award

The Sense of Agency (SoA) is crucial in interaction with technology: it refers to the feeling of ‘I did that’ as opposed to ‘the system did that’, supporting a feeling of being in control. Research in human-computer interaction has recently studied agency in visual, auditory and haptic interfaces; however, the role of smell in agency remains unknown. Our sense of smell is powerful in eliciting emotions, memories and awareness of the environment, which has been exploited to enhance user experiences (e.g., in VR and driving scenarios). In light of increased interest in designing multimodal interfaces including smell, and its close link with emotions, we investigated, for the first time, the effect of smell-induced emotions on the SoA. We conducted a study using the Intentional Binding (IB) paradigm, used to measure SoA, while participants were exposed to three scents of different valence (pleasant, unpleasant, neutral). Our results show that participants’ SoA increased with a pleasant scent compared to neutral and unpleasant scents. We discuss how our results can inform the design of multimodal and future olfactory interfaces.
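To make the paradigm concrete, here is a minimal Python sketch of how an Intentional Binding score might be computed from time-judgement data; the scoring convention and the toy numbers are illustrative assumptions, not the study’s actual analysis pipeline.

```python
# A minimal sketch of an Intentional Binding (IB) score computed from
# time-judgement data. Column layout and scoring convention are
# illustrative assumptions, not the paper's analysis.

from statistics import mean

def judgement_shift(perceived_ms, actual_ms):
    """Mean judgement error: perceived minus actual event time (ms)."""
    return mean(p - a for p, a in zip(perceived_ms, actual_ms))

def intentional_binding(action, outcome, baseline_action, baseline_outcome):
    """
    Positive binding means the action is perceived later (shifted toward
    the outcome) and the outcome earlier (shifted toward the action),
    relative to single-event baseline blocks.
    """
    action_binding = judgement_shift(*action) - judgement_shift(*baseline_action)
    outcome_binding = judgement_shift(*baseline_outcome) - judgement_shift(*outcome)
    return action_binding + outcome_binding

# Toy data: (perceived times, actual times) in ms for each block.
action_block = ([510, 530, 525], [500, 500, 500])
outcome_block = ([740, 735, 750], [750, 750, 750])
baseline_action = ([498, 502, 500], [500, 500, 500])
baseline_outcome = ([752, 749, 751], [750, 750, 750])

print(intentional_binding(action_block, outcome_block,
                          baseline_action, baseline_outcome))
```

A larger score reflects a stronger compression of the perceived action-outcome interval, which is typically read as a stronger sense of agency.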

SMELL SPACE: Mapping out the Olfactory Design Space for Novel Interactions.

Emanuela Maggioni, Robert Cobden, Dmitrijs Dmitrenko, Kasper Hornbæk and Marianna Obrist. In ACM Transactions on Computer-Human Interaction (TOCHI 2020). [DOI]

The human sense of smell is powerful. However, the way we use smell as an interaction modality in human–computer interaction (HCI) is limited. We lack a common reference point to guide designers’ choices when using smell. Here, we map out an olfactory design space to provide designers with such guidance. We identified four key design features: (i) chemical, (ii) emotional, (iii) spatial, and (iv) temporal. Each feature defines a building block for smell-based interaction design and is grounded in a review of the relevant scientific literature. We then demonstrate the design opportunities in three application cases. Each application (i.e., one desktop, two virtual reality implementations) highlights the design choices alongside the implementation and evaluation possibilities in using smell. We conclude by discussing how identifying those design features facilitates a healthy growth of this research domain and contributes to an intermediate-level knowledge space. Finally, we discuss further challenges the HCI community needs to tackle.
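As a rough illustration of how these four features could be operationalised, here is a small Python sketch of a data structure for specifying a scent cue; the field names and value ranges are our own assumptions, not an interface defined in the paper.

```python
# A minimal sketch capturing the four design features (chemical,
# emotional, spatial, temporal) as a single scent-cue specification.
# Field names and ranges are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class OlfactoryCue:
    scent: str              # chemical: which odourant to deliver
    valence: float          # emotional: -1 (unpleasant) to +1 (pleasant)
    arousal: float          # emotional: calming (0) to stimulating (1)
    direction_deg: float    # spatial: where the scent appears to come from
    distance_m: float       # spatial: how far from the user it is released
    onset_s: float          # temporal: when delivery starts
    duration_s: float       # temporal: how long delivery lasts

# Example: a brief, calming vanilla cue released near the user.
cue = OlfactoryCue(scent="vanilla", valence=0.8, arousal=0.2,
                   direction_deg=0.0, distance_m=0.5,
                   onset_s=2.0, duration_s=3.0)
print(cue)
```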

Communicating Cosmology with Multisensory Metaphorical Experiences.

Roberto Trotta, Daniel Hajas, José Eliel Camargo-Molina, Robert Cobden, Emanuela Maggioni and Marianna Obrist. In Journal of Science Communication, 2020. [DOI]

We present a novel approach to communicating abstract concepts in cosmology and astrophysics in a more accessible and inclusive manner. We describe an exhibit aiming at creating an immersive, multisensory metaphorical experience of an otherwise imperceptible physical phenomenon — dark matter. Human-Computer Interaction experts and physicists co-created a multisensory journey through dark matter by exploiting the latest advances in haptic and olfactory technology. We present the concept design of a pilot and a second, improved event, both held at the London Science Museum, including the practical setup of the multisensory dark matter experience, the delivery of sensory stimulation and preliminary insights from users’ feedback.

CARoma Therapy: Pleasant Scents Promote Safer Driving, Better Mood, and Improved Well-Being in Angry Drivers.

Dmitrijs Dmitrenko, Emanuela Maggioni, Giada Brianza, Brittany E. Holthausen, Bruce N. Walker and Marianna Obrist. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). [DOI]

Driving is a task that is often affected by emotions. The effect of emotions on driving has been extensively studied, and anger is the emotion that dominates such investigations. Despite the known strong links between scents and emotions, few studies have explored the effect of olfactory stimulation in the context of driving. This leaves HCI practitioners with very little knowledge on how to design for emotions using olfactory stimulation in the car. We carried out three studies to select scents of different valence and arousal levels (i.e. rose, peppermint, and civet) and anger-eliciting stimuli (i.e. affective pictures and on-road events). We used this knowledge to conduct a fourth user study investigating how the selected scents change the emotional state, well-being, and driving behaviour of drivers in an induced angry state. Our findings enable better decisions on what scents to choose when designing interactions for angry drivers.

Beyond Vision: How Learning Can Be Reimagined Through Smell and Multisensory Experiences.

Emanuela Maggioni. In The Curve Magazine, The New Brain Issue, 9, pp. 42–44, 2019.

The human brain is a multisensory dynamic learning system. However, the learning experience has, traditionally, largely been limited to a static process based mainly on vision (books, blackboards, screens, etc.) and occasional audio – but completely neglecting the other sensory modalities. Restricting the learning experience in this way has defined the standards of our learning styles. Consequently, studies around learning have focused on a single sensory modality, even though research in the field of memory and learning shows that multisensory experiences can promote better cognitive performance and memory retention. Storytelling using multiple modalities is starting to be introduced, particularly around extending accessibility in the case of sensory impairment or for more realistic training or rehabilitation procedures – firefighter training or PTSD treatments, for example.

“Like Popcorn”: Investigating Crossmodal Correspondences Between Scents, 3D Shapes and Emotions in Children.

Oussama Metatla, Emanuela Maggioni, Clare Cullen and Marianna Obrist. In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems. Glasgow, Scotland, 2019. [DOI]

There is increasing interest in multisensory experiences in HCI. However, little research considers how sensory modalities interact with each other and how this may impact interactive experiences. We investigate how children associate emotions with scents and 3D shapes. Fourteen participants (10–17 yrs) completed cross-modal association tasks to attribute emotional characteristics to variants of the “Bouba/Kiki” stimuli, presented as 3D tangible models, in conjunction with lemon and vanilla scents. Our findings support pre-existing mappings between shapes and scents, and confirm the associations between the combination of angular shapes (“Kiki”) and lemon scent with arousing emotion, and of round shapes (“Bouba”) and vanilla scent with calming emotion. This extends prior work on cross-modal correspondences in terms of stimuli (3D as opposed to 2D shapes), sample (children), and conveyed content (emotions). We outline how these findings can contribute to designing more inclusive interactive multisensory technologies.

OWidgets: A Toolkit To Enable Smell-based Experience Design.

Emanuela Maggioni, Robert Cobden and Marianna Obrist. In International Journal of Human-Computer Studies, 2019. [DOI]

Interactive technologies are transforming the ways in which people experience, interact and share information. Advances in technology have made it possible to generate real and virtual environments with breathtaking graphics and high-fidelity audio. However, without stimulating the other senses, such as touch and smell, and even taste in some cases, such experiences feel hollow and fictitious; they lack realism. One of the main stumbling blocks for progress towards creating truly compelling multisensory experiences is the lack of appropriate tools and guidance for designing beyond audio-visual applications. Here we focus particularly on the sense of smell and how smell-based design can be enabled to create novel user experiences. We present OWidgets, a design toolkit for smell, consisting of a graphical user interface and the underlying software framework. The framework uses two main components, a Mapper and a Scheduler, facilitating the device-independent replication of olfactory experiences. We discuss how our toolkit reduces the complexity of designing with smell and enables creative exploration based on specific design features. We conclude by reflecting on future directions to extend the toolkit and integrate it into the wider audio-visual ecosystem.
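For readers who want a feel for the Mapper/Scheduler split, here is a minimal Python sketch of the idea; all class names, method names, and the command format are illustrative assumptions rather than the actual OWidgets API.

```python
# A minimal sketch of a Mapper/Scheduler split: the Mapper translates an
# abstract scent request into commands for whatever delivery device is
# attached, and the Scheduler orders requests over time. Names and
# command syntax are illustrative assumptions, not the OWidgets API.

import heapq

class Mapper:
    """Maps abstract scent names to device-specific channels."""
    def __init__(self, channel_table):
        self.channel_table = channel_table   # e.g. {"lemon": 1, "lavender": 2}

    def to_command(self, scent, duration_s):
        channel = self.channel_table[scent]
        return f"EMIT channel={channel} duration={duration_s}s"

class Scheduler:
    """Orders scent events by onset time, independent of the device."""
    def __init__(self, mapper):
        self.mapper = mapper
        self.queue = []

    def add(self, onset_s, scent, duration_s):
        heapq.heappush(self.queue, (onset_s, scent, duration_s))

    def run(self):
        while self.queue:
            onset, scent, duration = heapq.heappop(self.queue)
            print(f"t={onset:>4}s  {self.mapper.to_command(scent, duration)}")

# Retargeting another delivery device only requires a new channel table;
# the scheduled experience itself stays unchanged.
scheduler = Scheduler(Mapper({"lemon": 1, "lavender": 2}))
scheduler.add(5, "lavender", 3)
scheduler.add(0, "lemon", 2)
scheduler.run()
```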

As light as your scent: effects of smell and sound on body image perception.

Giada Brianza, Ana Tajadura-Jiménez, Emanuela Maggioni, Dario Pittera, Nadia Bianchi-Berthouze and Marianna Obrist. In Proceedings of the 17th IFIP TC 13 International Conference on Human-Computer Interaction (INTERACT 2019), Paphos, Cyprus, 2–6 September 2019.

How people mentally represent their body appearance (i.e., body image perception – BIP) does not always match their actual body. BIP distortions can lead to a detriment in physical and emotional health. Recent works in HCI have shown that technology can be used to change people’s BIP through visual, tactile, proprioceptive, and auditory stimulation. This paper investigates, for the first time, the effect of olfactory stimuli, by looking at a possible enhancement of a known auditory effect on BIP. We present two studies building on emerging knowledge in the field of crossmodal correspondences. First, we explored the correspondences between scents and body shapes. Then, we investigated the impact of combined scents and sounds on one’s own BIP. Our results show that scent stimuli can be used to make participants feel lighter or heavier (i.e., using lemon or vanilla) and to enhance the effect of sound on perceived body lightness. We discuss how these findings can inform future research and design directions to overcome body misperception and create novel augmented and embodied experiences.

S(C)ENTINEL: monitoring automated vehicles with olfactory reliability displays.

Philipp Wintersberger, Dmitrijs Dmitrenko, Clemens Schartmüller, Anna-Katharina Frison, Emanuela Maggioni, Marianna Obrist, and Andreas Riener. In Proceedings of the 24th International Conference on Intelligent User Interfaces (IUI ’19). ACM, New York, NY, USA. [DOI]

Overreliance on technology is safety-critical, and it is assumed that this may have been a main cause of severe accidents with automated vehicles. To ease the complex task of permanently monitoring vehicle behaviour in the driving environment, researchers have proposed implementing reliability/uncertainty displays. Such displays allow drivers to estimate whether or not an upcoming intervention is likely. However, presenting uncertainty visually just adds more visual workload for drivers, who might also be engaged in secondary tasks. We suggest using olfactory displays as a potential solution for communicating system uncertainty, and conducted a user study (N=25) in a high-fidelity driving simulator. Results of the experiment (conditions: no reliability display, purely visual reliability display, and visual-olfactory reliability display), combining both objective (task performance) and subjective (technology acceptance model, trust scales, semi-structured interviews) measures, suggest that olfactory notifications could become a valuable extension for calibrating trust in automated vehicles.
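One simple way such a display could work is sketched below in Python: a scent pulse is released only when automation confidence drops below a threshold, nudging the driver to monitor without adding visual load. The threshold, intensity rule, and scent choice are illustrative assumptions, not the study’s design.

```python
# A minimal sketch of an olfactory reliability display: release a scent
# only when automation confidence is low. Threshold, intensity rule,
# and scent are illustrative assumptions.

def olfactory_reliability_cue(confidence, threshold=0.7):
    """Return a scent command when automation confidence is low."""
    if confidence < threshold:
        # Lower confidence -> stronger pulse, capped at the device maximum.
        intensity = min(1.0, (threshold - confidence) / threshold + 0.3)
        return {"scent": "peppermint", "intensity": round(intensity, 2)}
    return None  # reliable: stay silent, adding no workload

for c in (0.95, 0.65, 0.40):
    print(c, olfactory_reliability_cue(c))
```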

Smell-O-Message: Integration of Olfactory Notifications into a Messaging Application to Improve Users’ Performance.

Emanuela Maggioni, Robert Cobden, Dmitrijs Dmitrenko, and Marianna Obrist. In Proceedings of the 20th ACM International Conference on Multimodal Interaction (ICMI ’18). ACM, New York, NY, USA, 2018. [DOI]

Smell is a powerful tool for conveying and recalling information without requiring visual attention. Previous work, however, identified challenges caused by users’ unfamiliarity with this modality and by the complexity of scent delivery. We are now able to overcome these challenges by introducing a training approach to familiarise users with scent-meaning associations (the urgency of a message, and the sender’s identity) and by using a controllable scent-delivery device. Here we re-validate the effectiveness of smell as a notification modality and present findings on the performance of smell in conveying information. In a user study composed of two sessions, we compared the effectiveness of visual, olfactory, and combined visual-olfactory notifications in a messaging application. We demonstrated that olfactory notifications improve users’ confidence and performance in identifying the urgency level of a message, with the same reaction time and disruption levels as visual notifications. We discuss the design implications and opportunities for future work in the domain of multimodal interactions.
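The flavour of such trained scent-meaning associations can be sketched in a few lines of Python; the particular scents, urgency levels, and sender groups below are illustrative assumptions, not the mappings used in the study.

```python
# A minimal sketch of a scent-meaning mapping for message notifications:
# one scent per urgency level and per sender group. All mappings here
# are illustrative assumptions.

URGENCY_SCENTS = {"low": "lavender", "medium": "lemon", "high": "peppermint"}
SENDER_SCENTS = {"family": "vanilla", "work": "rose"}

def olfactory_notification(message):
    """Compose a scent cue from message metadata, no screen needed."""
    return (URGENCY_SCENTS[message["urgency"]],
            SENDER_SCENTS[message["sender_group"]])

print(olfactory_notification({"urgency": "high", "sender_group": "work"}))
# ('peppermint', 'rose')
```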

I Smell Trouble: Using Multiple Scents To Convey Driving-Relevant Information.

Dmitrijs Dmitrenko, Emanuela Maggioni, and Marianna Obrist. In Proceedings of the 20th ACM International Conference on Multimodal Interaction (ICMI ’18). ACM, New York, NY, USA, 2018. [DOI]

Cars provide drivers with task-related information (e.g. “Fill gas”) mainly through visual and auditory stimuli. However, those stimuli may distract or overwhelm the driver, causing unnecessary stress. Here, we propose olfactory stimulation as a novel feedback modality to support the perception of visual notifications, reducing the visual demand on the driver. Based on previous research, we explore the application of the scents of lavender, peppermint, and lemon to convey three driving-relevant messages (i.e. “Slow down”, “Short inter-vehicle distance”, “Lane departure”). Our paper is the first to demonstrate the application of olfactory conditioning in the context of driving and to explore how multiple olfactory notifications change driving behaviour. Our findings demonstrate that olfactory notifications are perceived as less distracting, more comfortable, and more helpful than visual notifications. Drivers also make fewer driving mistakes when exposed to olfactory notifications. We discuss how these findings inform the design of future in-car user interfaces.
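A minimal Python sketch of such a message-to-scent mapping is shown below; which scent carries which message is an illustrative assumption, since in the study the associations are established through conditioning.

```python
# A minimal sketch mapping the three driving-relevant messages to the
# three scents named in the abstract. The specific pairings are
# illustrative assumptions.

MESSAGE_TO_SCENT = {
    "slow_down": "lavender",
    "short_inter_vehicle_distance": "peppermint",
    "lane_departure": "lemon",
}

def notify_driver(event, emit):
    """Trigger a scent pulse instead of a visual warning."""
    scent = MESSAGE_TO_SCENT.get(event)
    if scent is not None:
        emit(scent)

notify_driver("lane_departure", emit=lambda s: print(f"delivering {s}"))
```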

OSpace: Towards a Systematic Exploration of Olfactory Interaction Spaces.

Dmitrijs Dmitrenko, Emanuela Maggioni and Marianna Obrist. In Proceedings of the ACM International Conference on Interactive Surfaces and Spaces (ISS 2017). Brighton, United Kingdom. [DOI]

When designing olfactory interfaces, HCI researchers and practitioners have to carefully consider a number of issues related to scent delivery, detection, and lingering, and these are just a few of the problems to deal with. We present OSpace – an approach for designing, building, and exploring an olfactory interaction space. Our paper is the first to explore in detail not only the scent-delivery parameters but also the air-extraction issues. We conducted a user study to demonstrate how scent detection and lingering times can be acquired under different air-extraction conditions, and how the impact of scent type, dilution, and intensity can be investigated. Results show that with our setup, scents can be perceived by the user within ten seconds and take less than nine seconds to disappear, whether the extraction is on or off. We discuss the practical application of these results for HCI.
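As an illustration, detection and lingering times like those reported above could be logged with a timing loop along the lines of the following Python sketch; the key-press protocol and function names are our own assumptions, not the actual OSpace implementation.

```python
# A minimal sketch of logging scent detection and lingering times in a
# user study: the participant signals when the scent appears and again
# when it disappears. The protocol is an illustrative assumption.

import time

def measure_scent_timing(start_delivery, wait_for_keypress):
    t0 = time.monotonic()
    start_delivery()                      # open the scent channel
    wait_for_keypress("scent detected")   # participant signals onset
    detection_s = time.monotonic() - t0

    t1 = time.monotonic()
    wait_for_keypress("scent gone")       # participant signals offset
    lingering_s = time.monotonic() - t1
    return detection_s, lingering_s

# Simulated run: lambdas stand in for the device and the response key.
detection, lingering = measure_scent_timing(
    start_delivery=lambda: None,
    wait_for_keypress=lambda prompt: time.sleep(0.01))
print(round(detection, 3), round(lingering, 3))
```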

Smelling the Space Around Us: Odor Pleasantness Shifts Visuospatial Attention in Humans.

Luca Rinaldi, Emanuela Maggioni, Nadia Olivero, Angelo Maravita and Luisa Girelli. In Emotion, 5, 1–5 (2017). [DOI]

The prompt recognition of pleasant and unpleasant odors is a crucial regulatory and adaptive need of humans; reactive responses to unpleasant odors ensure survival in many threatening situations. Notably, although humans typically react to certain odors by modulating their distance from the olfactory source, the effect of odor pleasantness on the orienting of visuospatial attention is still unknown. To address this issue, we first trained participants to associate visual shapes with pleasant and unpleasant odors, and then assessed the impact of this association on a visuospatial task. Results showed that using the trained shapes as flankers modulates performance in a line bisection task. Specifically, the estimated midpoint was shifted away from the visual shape associated with the unpleasant odor, whereas it was moved toward the shape associated with the pleasant odor. This finding demonstrates that odor pleasantness selectively shifts human attention in the surrounding space.
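The attentional measure at work here can be made concrete with a short Python sketch of the bisection bias; the sign convention and toy coordinates are illustrative assumptions.

```python
# A minimal sketch of the line-bisection measure: the signed error of
# the marked midpoint relative to the true midpoint shows which flanker
# is pulling attention. Sign convention is an illustrative assumption.

def bisection_bias(marked_x, line_start_x, line_end_x):
    """Positive bias = midpoint displaced toward the right flanker."""
    true_mid = (line_start_x + line_end_x) / 2
    return marked_x - true_mid

# Pleasant-odour shape on the right: midpoint drifts toward it (+).
print(bisection_bias(marked_x=103.0, line_start_x=0.0, line_end_x=200.0))  # 3.0
# Unpleasant-odour shape on the right: midpoint pushed away (-).
print(bisection_bias(marked_x=96.5, line_start_x=0.0, line_end_x=200.0))   # -3.5
```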

Special issue: Multisensory human–computer interaction.

Marianna Obrist, Nimesha Ranasinghe, and Charles Spence. In International Journal of Human-Computer Studies, 2017. [DOI]

Every day, in their real-world interactions, people use their senses (all of them) and various facial and bodily expressions. For example, even a simple experience such as having a coffee with a friend involves multiple sensory information and input/output (I/O) channels such as smell, taste, vision, haptics, and sound. Information from these very different sensory channels is combined to create compelling experiences and memories. However, our current interactions with technology are dominated by visual, auditory, and, to a lesser extent, tactile interfaces. The chemical senses (i.e., smell, taste, and the trigeminal sense) are still mostly neglected, often treated as ‘lower’, or somehow more primitive, senses with seemingly little to add to the field of human-computer interaction (HCI). However, given the immediacy of touch and the ubiquity of taste and smell, not to mention their importance to health, safety, work, leisure, pleasure, and a person’s sense of emotional well-being, future multisensory experiences with interactive technologies could have a major impact on society and consumer markets, creating entirely new product, technology, and service opportunities. More importantly, multisensory experience research promises to deliver a step-change in our understanding of the human senses as interaction modalities and potentially to revolutionize existing interaction paradigms within HCI.

Multisensory Experiences in HCI.

Marianna Obrist, Elia Gatti, Emanuela Maggioni, Chi Thanh Vi and Carlos Velasco. 2017. In IEEE MultiMedia, vol. 24, no. 2, pp. 9-13, Apr.-June 2017. [DOI]

The use of vision and audition for interaction dominated the field of human-computer interaction (HCI) for decades, despite the fact that nature has provided us with many more senses for perceiving and interacting with the world around us. Recently, HCI researchers have started trying to capitalize on touch, taste, and smell when designing interactive tasks, especially in gaming, multimedia, and art environments. Here we provide a snapshot of our research into touch, taste, and smell, which we’re carrying out at the Sussex Computer-Human Interaction (SCHI—pronounced “sky”) Lab at the University of Sussex in Brighton, UK.