Smell and HCI - Sensing the Digital World
This is the 5th article in my series exploring how Human–Computer Interaction (HCI) engages our senses. Smell bypasses conscious thought and connects directly to memory and emotion. It is powerful, personal, and still rare in digital experiences.
TL;DR
Smell bypasses conscious thought, triggering memory and emotion more directly than other senses.
Current applications range from VR therapy to medical diagnostics, but adoption remains limited by technical constraints.
Key barriers include physical delivery requirements, cultural subjectivity, and lack of standards.
Future opportunities lie in therapeutic, immersive, and accessibility applications that require careful ethical consideration.
HCI and Smell
Smell (or olfaction) is the process by which humans detect and interpret airborne chemicals. While often treated as secondary to vision or hearing in interface design, scent plays a central role in memory, emotion, and perception.
In human–computer interaction (HCI), olfactory interfaces are an emerging field focused on integrating smell into digital experiences (from immersive VR to therapeutic tools). Although still limited by technical constraints, the potential for emotional depth, realism, and inclusivity makes smell a sense worth exploring. Especially as designers begin layering visual, auditory, and tactile inputs, scent could become the missing link in creating truly immersive and emotionally intelligent systems.
Memory in a Breath
Think of a smell that takes you back. Not just to a general memory, but to a moment. A kitchen, a classroom, a person you miss. The kind of memory that arrives fully formed (unfiltered, emotional, immediate).
For me, it happened on a recent walk through a local state park. A sudden whiff of gloss paint drifted over from a building being renovated nearby.
In that moment, I wasn’t in the park anymore.
I was a child again, in a particular place, at a specific time. The memory didn’t just return, it took over. As if a time portal had cracked open and pulled me through.
That’s the strange power of scent. More than any other sense, it bypasses logic and reaches memory, meaning, and feeling directly.
And yet, it’s almost entirely missing from our digital lives. We design for sight, for sound, sometimes even for touch, but rarely for smell.
What would shift if we invited it in?
The Power of Smell
Smell is ancient. Biologically speaking, it predates sight, hearing, and language. Before we could name a threat or describe a memory, we could smell it.
Unlike our other senses, smell skips the brain’s usual processing center (the thalamus) and goes straight to the limbic system, the region responsible for memory and emotion. That’s why it can hit so hard, arrive fast, and linger long after other details fade.
But smell isn’t just instinctive; it’s also shaped by culture and context.
A scent that signals comfort to one person might mean danger to another. What smells clean or delicious in one culture might suggest something very different elsewhere. Unlike sight or sound, which are often standardised in design, scent resists easy categorisation.
Even if we could agree on what a scent means, there’s another problem: reproducing it. Unlike light or sound, which can be digitised and transmitted with precision, smell is physical. It requires particles, hardware, and chemistry. Delivering it on demand (and making it stop) is technically challenging.
That combination of subjectivity and complexity helps explain why smell is often left out of our digital tools and experiences.
But people don’t stop smelling. Maybe our systems shouldn’t, either.
Olfactory Interfaces in HCI
In Human–Computer Interaction (HCI), the study of scent-based technology falls under Olfactory Interfaces. While still emerging, this area explores how scent can trigger emotion, enhance immersion, support therapeutic use, or signal contextual cues.
This includes:
Scent-Emitting Devices: systems that release smell in coordination with media, feedback, or interaction
Olfactory Displays: tools that encode and present information using scent rather than visuals or audio
Multisensory VR: virtual environments enhanced with smell to increase realism or emotional response
Electronic Noses (E-noses): devices that detect chemical patterns in air for diagnostics or input sensing
Affective Interfaces: scent used to evoke mood, memory, or atmosphere
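To make the first category concrete, here is a minimal sketch of how a scent-emitting device might coordinate releases with a media timeline. Everything here is hypothetical (the class names, the cartridge labels, the event model); real devices such as Aromajoin’s expose their own proprietary APIs.

```python
import heapq

class ScentEvent:
    """A scent release scheduled relative to a media timeline (hypothetical model)."""
    def __init__(self, at_seconds, cartridge, intensity, duration):
        self.at_seconds = at_seconds  # trigger time, measured from media start
        self.cartridge = cartridge    # named cartridge, e.g. "citrus"
        self.intensity = intensity    # 0.0–1.0 emitter power
        self.duration = duration      # seconds of release

class ScentTrack:
    """Orders scent events so a playback loop can fire them in sync with media."""
    def __init__(self):
        self._events = []

    def add(self, event):
        # id() breaks ties so the heap never compares ScentEvent objects directly
        heapq.heappush(self._events, (event.at_seconds, id(event), event))

    def due(self, playhead_seconds):
        """Pop every event whose trigger time the playhead has passed."""
        fired = []
        while self._events and self._events[0][0] <= playhead_seconds:
            fired.append(heapq.heappop(self._events)[2])
        return fired

track = ScentTrack()
track.add(ScentEvent(2.0, "pine", 0.6, 3.0))
track.add(ScentEvent(0.5, "citrus", 0.4, 1.5))

first = track.due(1.0)   # only the citrus event is due by 1.0s
second = track.due(3.0)  # the pine event fires by 3.0s
```

The point of the sketch is the synchronisation problem itself: unlike an audio track, each fired event still needs physical dispersal and clearance time, which is exactly why alignment with fast-cut media is hard.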
Smell Capabilities
| Capability | Description |
| --- | --- |
| Odour Detection | Sensing airborne chemical molecules, even at extremely low concentrations |
| Discrimination | Distinguishing between similar but distinct smells, such as citrus vs floral |
| Adaptation | Adjusting sensitivity after prolonged exposure, often making a scent fade from awareness |
| Associative Memory | Linking scent to emotional or episodic memory, often involuntarily |
| Cultural Framing | Interpreting scent through learned context (e.g., what smells ‘clean’ or ‘edible’) |
Current Applications
While scent hasn’t gone mainstream in tech, it hasn’t been completely ignored. In research labs and early-stage ventures, smell is beginning to find its way into digital contexts.
Scent-emitting devices: Tools like Aromajoin (Japan), oNotes, and others have explored scent cartridges triggered by media. These devices try to align scent with visual or auditory content.
Virtual reality therapy: In exposure therapy for PTSD or anxiety, adding scent can make environments feel more real (or more safely controlled). Research by institutions like USC’s Institute for Creative Technologies has explored this in clinical VR settings.
Medical diagnostics: Electronic noses use chemical sensors to detect markers in breath or sweat. Some research (such as MIT’s and the European Commission’s SmellData project) links this to early detection of conditions like cancer or infections.
Retail and branding: Physical environments have long used scent to shape perception. Some organisations are exploring how those cues could carry into digital or hybrid experiences.
These applications are niche for now, but they hint at something broader: smell may be underutilised, but it is not irrelevant.
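The e-nose idea is worth unpacking, since it is the most algorithmically tractable of these applications. An electronic nose reads a vector of responses from an array of chemical sensors and matches it against known patterns. The sketch below uses a nearest-centroid match on mock data; the sensor values and labels are illustrative only, not real diagnostic signatures.

```python
import math

# Mock "e-nose" readings: each sample is a vector of responses from an array
# of chemical sensors. Values and labels are invented for illustration.
TRAINING = {
    "healthy": [[0.2, 0.1, 0.4], [0.3, 0.1, 0.5]],
    "marker_present": [[0.8, 0.6, 0.2], [0.7, 0.7, 0.1]],
}

def centroid(vectors):
    """Average each sensor channel across a class's training samples."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample):
    """Label a new reading by its nearest known chemical pattern."""
    centroids = {label: centroid(vs) for label, vs in TRAINING.items()}
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

result = classify([0.75, 0.65, 0.15])  # lands closest to "marker_present"
```

Real systems replace the toy centroids with trained classifiers over dozens of sensor channels, but the pipeline (sense a chemical fingerprint, compare against learned patterns) is the same shape.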
Use Cases
Smell already plays a role in how we move through the world. In carefully chosen moments, it could enhance how we navigate digital ones.
Therapy and emotional regulation: Used alongside mindfulness apps or in clinical care, scent could support calm, focus, or recovery.
Immersive storytelling: In VR or spatial media, scent might be layered not for realism, but for emotional nuance or narrative depth.
Remote connection: Experimental devices have explored transmitting personal or environmental scents. Still early, but potentially meaningful.
Training and safety: In aviation or healthcare simulation, scent could support situational awareness or muscle memory.
Accessibility: For people with visual impairments, scent might offer ambient cues or feedback (if designed with choice and care).
Not every interface needs to smell. But when the goal is presence or emotion, scent might be part of the solution.
Challenges
Designing for smell isn’t just about creativity; it’s about constraints. And the barriers are both technical and human.
Delivery is physical: Scent requires matter. That means cartridges, dispersion controls, and real chemical compounds.
Hard to reset: Unlike audio or visuals, a scent can’t be paused or instantly cleared. It lingers.
Subjective and cultural: Smell is interpreted through personal, emotional, and cultural lenses. There’s no universally accepted neutral.
Device fatigue: Many scent interfaces struggle with cost, maintenance, and refill logistics.
Lack of shared standards: There’s no simple protocol for encoding or transmitting scent across platforms.
These constraints don’t make scent unusable, but they make it harder to scale, standardise, and sustain.
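The standards gap is easiest to see by imagining what a shared format would even contain. The sketch below is a purely hypothetical interchange record for a single scent cue; no such standard exists today, which is the point. Note that a well-formed cue has to encode clearance time as a first-class field, because lingering is a property of the medium, not a bug.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ScentCue:
    """A hypothetical cross-platform scent cue. All field names are invented."""
    odorant: str        # named scent family, e.g. "citrus"
    intensity: float    # 0.0–1.0 emitter power
    onset_ms: int       # delay before release begins
    duration_ms: int    # how long to emit
    clearance_ms: int   # active venting time before the next cue can start

    def to_json(self):
        return json.dumps(asdict(self))

cue = ScentCue("citrus", 0.4, 0, 1500, 4000)
payload = cue.to_json()
restored = ScentCue(**json.loads(payload))  # round-trips losslessly
```

Even this toy format exposes the hard questions: who defines the odorant vocabulary, how intensity maps onto heterogeneous hardware, and what happens on devices without the requested cartridge.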
Ethics and Equity
Smell is subtle, but not neutral. Designing with it means considering its impact (intended or not).
Consent: People can’t un-smell something. Systems need clear opt-in mechanisms.
Manipulation: Scent has been used to influence behaviour in physical environments. That potential carries into digital ones.
Cultural assumptions: What smells clean or calming in one culture may feel invasive in another. Design must localise.
Accessibility: Some people can’t smell at all, while others are sensitive to scent. Alternatives and accommodations matter.
Privacy: As scent-tracking tech evolves, so do the risks of biometric or behavioural inference.
Smell can shape how we feel, decide, and remember. That power deserves both caution and care.
Future Trends
Smell is unlikely to become a standard input or output in most systems. But in specific contexts, it may quietly take hold.
In multisensory design, scent could be layered with visual, auditory, and haptic cues, not for realism alone, but for emotional depth. Designers working in VR, mindfulness, or interactive storytelling are already exploring this possibility.
Wearable scent tech is also evolving. Small, personal dispensers worn as pendants, clips, or wristbands could support wellbeing, alertness, or comfort in everyday settings.
AI may push us further into personalised scent experiences, matching olfactory cues to mood states, behavioural patterns, or biometric signals. That raises new questions about consent, manipulation, and digital scent profiling.
At the same time, researchers are working to build the foundations: defining scent vocabularies, encoding methods, and interface patterns that could eventually support broader adoption across platforms.
And for those who can’t smell, or who react sensitively to scent, we may see innovations in translation and substitution (ambient cues, contextual summaries, or multimodal alternatives that honour presence without relying on smell itself).
These trends aren’t inevitable, but for designers thinking about presence, emotion, or care, they offer a path worth watching.
Human vs HCI – Smell
| Human Sense | HCI Equivalent |
| --- | --- |
| Bypasses conscious thought, evoking fast, emotional responses | Used in immersive VR and therapeutic contexts to trigger memory or calm |
| Subjective and shaped by culture, memory, and context | Requires localisation and optionality to avoid misinterpretation |
| Cannot be paused or removed easily, tends to linger | Hard to control or reset, often needs physical dispersal mechanisms |
| Often unnoticed but highly influential | Often excluded in digital systems due to technical and perceptual bias |
| Inaccessible for some (e.g., anosmia, allergies) | Must include opt-out choices and alternative sensory cues |
The One-Sided Memory Test
Try this while sitting quietly:
Think of a smell, something distinct. Freshly cut grass, a family recipe, a hallway you haven’t walked down in years.
Close your eyes, take a slow, deep breath, and continue thinking.
Notice what returns.
Not just the scent itself, but the fragments that come with it (a room, a voice, a season, a feeling).
These flashes of memory show how deeply scent is wired into our cognition. So what happens when we try to replicate that connection with technology?
Closing Thoughts
It may take a very long time for smell to become a core design tool, but that doesn’t mean we should ignore it.
It’s a reminder that memory and emotion aren’t abstract. That presence is physical. That connection sometimes begins with a breath.
For teams designing systems people want to inhabit, scent offers something other interfaces can’t: a direct path to memory and meaning. That’s worth more than a moment’s consideration.
Next, we’ll explore taste, our most intimate sense: how it might complete the sensory picture, and what happens when all our senses converge in digital space.
Further Reading
For those interested in exploring the role of smell in digital experiences, these sources offer a deeper look into the science, design, and potential of olfactory interaction.
Soft, miniaturized, wireless olfactory interface for virtual reality by Liu et al. (2023)
Describes a breakthrough wearable olfactory interface with miniaturized odor generators that achieve millisecond-level response time and low power consumption for immersive VR/AR applications.
SMELL SPACE: Mapping out the Olfactory Design Space for Novel Interactions by Maggioni et al. (2021)
Establishes a comprehensive design framework with four key features (chemical, emotional, spatial, and temporal) that guide interaction design with practical examples.
Olfactory Virtual Reality: A New Frontier in the Treatment and Prevention of Posttraumatic Stress Disorder by Herz (2021)
Explores how olfactory VR may support PTSD treatment through desensitization, memory reappraisal, and prevention for at-risk populations.
The electronic nose in lung cancer diagnostics: a systematic review and meta-analysis by Kort et al. (2025)
Analyzes 35 studies covering 4,483 patients, showing that electronic nose devices can achieve 90% sensitivity and 89% specificity for noninvasive lung cancer detection.
An olfactory-based Brain-Computer Interface: electroencephalography changes during odor perception and discrimination by Morozova et al. (2023)
Identifies frontal theta modulation during odor perception, laying the groundwork for olfactory BCIs in treating anosmia and cognitive decline.