What Will the Metaverse Smell Like?
The metaverse is expanding by the minute, and speculation abounds as to what each of us might want to do there. Attend virtual parties? Catch a virtual wave? Buy and furnish a virtual home? What remains nebulous, however, is what it will actually be like to do these things. What exactly will virtual experiences sound like, or feel like, or taste like? Dare we ask … what might they smell like?
Jas Brooks, a Ph.D. student in the University of Chicago’s department of computer science and a member of its Human Computer Integration Lab, is part of a growing group of researchers who have taken up the latter question as one of the drivers of their work. Over the past four years, Brooks has been studying chemical interfaces—a class of devices that emit chemicals to alter users’ sense-based experiences—and how they might be used in both virtual and augmented reality.
Central to Brooks’s research is the trigeminal nerve, a bundle of fibers with endings in the nostrils that detect physical temperature shifts and interpret certain chemicals—such as menthol and capsaicin (the active component of chili peppers)—as coolness or warmth. They and their team have engineered devices that interact with the nerve in a variety of ways. One project involves a wearable thermal display—a VR headset with a contraption that extends below the nose—that creates the illusion of varying temperatures by stimulating the user’s trigeminal nerve with custom scent chemicals diffused directly into the nostrils. Upon entering the virtual setting, users find themselves in a cabin in a wooded area blanketed by snow. As they move around the cabin, which is heated by a furnace, the device introduces a capsaicin-based scent that, as it’s breathed in, generates a feeling of warmth. When the wearer steps outside into the cold, the device emits a mint-scented chemical, eucalyptol, which evokes the chill of an icy mountain. The device could be a key tool in reproducing immersive, climate-controlled environments in the metaverse.
Most recently, Brooks helped engineer a “stereo-smell” device, made to be worn across the septum like a snore-stopper. Controlled wirelessly via Bluetooth, it delivers varying electrical pulses to the trigeminal nerve, enabling a user to perceive the direction from which a gas, such as methane, is emanating. Brooks proposes that this device could not only contribute to the three-dimensional scent experience of the metaverse, but also augment our existing reality by helping users to, for example, locate a gas leak in their homes.
We recently spoke with Brooks to learn more about their research and its potential implications, both inside and outside of the metaverse.
What led to your interest in how senses might be experienced in virtual worlds?
I initially got into this area of research because of personal experiences with odor that had huge impacts on me. For instance, moments when I was walking down the street and was suddenly hit by an odor that was passing by. I’d just stop and freeze and think, Wow, that reminded me of something from years ago. Or I’d wonder where that odor came from, and walk around, trying to find it. Those kinds of experiences really highlight the sensuality of odor that keeps me coming back. I’m really interested in smell itself as a modality. It’s so captivating and evocative.
How do you see scent playing a role in virtual and augmented reality?
Smell has always been thought of for its entertainment value, or marketability. So in virtual reality, it could be used to interact with perfumes or with foods. It could also be valuable on the educational side of things. It could help with sensory education (learning to appreciate different food smells), heritage conservation (preserving 3-D smell experiences from history), ambient screenings (such as checking if you potentially have Covid-19 through existing VR experiences), or odor training (training a user to find the source of an odor). In terms of the latter, two researchers, Simon Niedenthal and Jonas Olofsson, did really interesting work on wine tasting, trying to train participants to perceive certain blends of odors and tell which components are part of a particular wine. That’s fantastic for VR, because it provides the context that’s necessary—the wine and a standardized environment—for people to actually train themselves. It’s the kind of situation where the odor is critical to the actual activity.
But for me, I don’t see much of a separation between scent’s uses in virtual versus augmented reality. The challenge with virtual reality is that you have to essentially reproduce the full smell experience. It’s the same as for vision, where you have to block out your vision completely, and then have a computer powerful enough to create a 3-D simulation for you to navigate. With smell, that becomes a challenge, because you have a limited space for reservoirs, or fragrance sources. And there’s no equivalent to RGB for smell—we don’t have a super tiny set of odor molecules that can produce the entire landscape of smells.
That’s why I’m really excited about smell in augmented reality in particular, because in augmented reality, smell isn’t so much a limitation as something we can play with. We have such rich odors in everyday life—like, you microwave something and open it, and it’s just got this puff of smells that are super tantalizing—that we can’t easily reproduce, but we can change how we interact with them. And you don’t need every odor molecule in the world to change how you perceive odors that are already in that space. So with our stereo-smell project, you could potentially take an odor that already exists in the space and change how you’re interacting with it, by adding something on top or by decreasing things. This is the future work that we’re hoping to do.
Can you describe what it’s like to use the products themselves?
Because it’s university research stuff, it’s all prototypes—so it’s never the perfect, final version. The temperature-simulating device wasn’t the best-engineered device, because I’m by no means the world’s best engineer. But there were no issues with comfort.
For the stereo-smell device, the design itself could definitely be slimmer. It’s already tiny for what it is, because you can fit it in your nose and still breathe comfortably. But it could easily become one-third or one-fourth of its size, if engineers were put on that project for longer. Comfort-wise, the people who tried it were mostly okay with it. The biggest limitation is probably batteries, which have always been a constraint in research. Maybe one day, you could make a version of it that’s self-powered by your breathing. That would be amazing. As for the smells themselves, some of the simulations were very strong, and as you can imagine, people were saying, “Oh, I don’t really want to follow this super-strong smell!”
As for the quality of the smell experience itself, the descriptions from our test subjects were along the lines of bubbly, vinegary, wasabi, refreshing, and warm. So it was a mix of tactile and smell perceptions, which is to be expected, because the nerve that we’re stimulating perceives both.
You’ve suggested that these smell technologies might have uses beyond virtual or augmented reality. What are some other potential applications of the devices you and your team have developed?
For the stereo-smell project, what’s exciting is that we’re stimulating one of the nerves that’s usually retained even when someone experiences smell loss, or anosmia. Most people with anosmia lose function of the olfactory bulb, but they may retain a dulled trigeminal nerve sensation. On one end of the spectrum, you could put in a super invasive olfactory implant—something like a cochlear implant for smell, a device that would stimulate the olfactory bulb to restore a user’s sense of smell—but that is still a ways away from being a reality.
So one of the things I’m curious about is, can we tap into that nerve and offer a non-invasive option that’s kind of like a hearing aid, but for smell? A smelling aid? It wouldn’t recover every odor the wearer experienced in the past, but it would at least help them detect things like a gas leak, or give food more nuance. It’s also fascinating to think of the technology’s potential therapeutic uses, like for PTSD, for instance. Odor is such a strong trigger for personal memory. This technology could be useful either for recalling an experience or for helping you get over that recall.
On the art side, it’s really about providing tools for people to produce new smell experiences. With a lot of previous olfactory experiences, it’s difficult to produce a controlled odor in the air, because air is so turbulent. The stereo-smell device would be able to render a more sculpted odor stream in the air that you could interact with. One artist, Maki Ueda, for example, does a lot of work with navigating smell mazes. She hangs scented pendants from the ceiling, and you’re supposed to choose what direction to go in based on the different smells of the pendants. What would it mean to reproduce that same experience but where there’s nothing in the room, except for the odors being presented? That’s interesting to think about. There’s also movies. The potential is pretty endless.
You also conduct historical research related to smell. What are you working on in this area now?
I’m working with my collaborator Tammy Burnstock, an Australian documentary filmmaker and olfactory artist, as well as Arizona State University professor Christy Spackman and graduate student Lauryn Mannigel, on a pretty fun project right now. We have the last existing Smell-O-Vision device in our library, from 1960. It failed at the time, infamously, but it’s created a kind of haunting in the smell technology world, where everything gets compared to it. It’s gotten kind of a bad rap in some sense, because when you read about the experience, and you interview people who actually experienced it, you learn that it had limitations—but it was also full of wonder. We’re trying to see if we can either conserve or restore the device, and potentially curate a series of scented films around it.
We have two other similar projects. There’s one that’s on AromaRama, which was the competing device at the time. I think I found a theater that still has the device, assuming that it hasn’t been renovated. No one really knows how the device worked, so I’ve been reading a lot of newspaper reviews and things like that, trying to understand what the experience was, how it worked, and what happened to all of the people associated with the device.
The other one is with my collaborator Simon Niedenthal from Malmö University in Sweden. We’re looking at DigiScents and other scent technology companies from the 1990s and 2000s that said they would be able to generate any odor a user could want—for games, and things like that. It didn’t work, either. We’ve been interviewing a bunch of people who were associated with these devices from the early 2000s, to try to understand their thinking and findings. What was the context at the time for them to actually try to produce these kinds of devices? Because, oftentimes, these kinds of devices are dismissed as gimmicks. And in some sense they are gimmicks—but so was any kind of technology, until it hit the perfect experience for people to realize that it had potential. It’s really about going back and developing a more nuanced understanding of what these devices were, which could potentially inform how we should be moving forward.