Although consumers have become very familiar with speech recognition, current discussions of the metaverse and extended reality (XR) largely ignore the role that speech and sound can play in virtual and augmented environments. After all, most people interact with their surroundings almost constantly, creating and identifying sounds.
That’s a big omission. As the quest for an immersive metaverse experience continues, true immersion will only be achieved when virtual landscapes and elements can deliver the full range of sensations that allow users to become one with these environments.
Siri has become one of the most popular voice assistants since Apple introduced the feature in 2011. The underlying technology dates back to a 2003 project at SRI International, which the research institute spun off as an independent company in 2007. Apple acquired that company in 2010; since then, Amazon has introduced Alexa and Google has created its Google Assistant.
Microsoft has its own speech recognition technology, and in March 2022 it completed its acquisition of Nuance Communications, a provider of speech recognition and artificial intelligence (AI) technologies. Coincidentally, Nuance also partially traces back to an SRI International spin-off, which was acquired by ScanSoft in 2005 before the combined company, renamed Nuance, became part of Microsoft.
Sound can become a story
But sound can be more than speech, and film and game developers have shown that sound itself can tell a story – whether it’s the ticking of a watch or the almost subliminal bass that echoes a heartbeat, not to mention explosions and the like.
Flexound has developed technology that can be built into theater or gaming chairs, cushions, and even furniture. The system creates personal sound spheres and lets users feel sound through their bodies. Meanwhile, Sennheiser developed the immersive sound technology that Netflix now uses in some of its movies and select scenes. Sennheiser has also created AMBEO, which offers spatial sound for a wide range of applications, including cinematic virtual reality.
Many companies also already offer services for designing meaningful audio environments for niche applications – even if their current focus is not on XR or metaverse-related applications. These efforts range from fairly complex services to well-targeted applications.
For example, Spatial Inc. develops soundscapes for retail stores, hospitality environments, office spaces and museums, among other places. Sen Sound, meanwhile, focuses on designing more pleasant and less stressful soundscapes for hospitals and healthcare facilities. It’s easy to see how this kind of design thinking could find its way into more meaningful and immersive augmented reality (AR) and virtual reality (VR) applications.
The eloquent touch of haptics
Another set of interface technologies that already play a role in high-end gaming will help XR apps become more immersive and, in many cases, more inclusive and relevant. Haptics can enable use cases that are currently difficult to translate from real to virtual environments.
Haptics are divided into tactile and kinesthetic sensations. Tactile experiences relate primarily to the skin, such as the perception of texture, touch, pressure or vibration. Kinesthetic experiences relate to muscles, tendons and joints and the perception of weight and stretch, as well as the movement of body parts. Various interfaces already exist, and more will be developed, to address these types of perception.
Perhaps the most beneficial will be haptic sensations in the hands. Holding a tool, squeezing an object, or feeling a surface in a virtual environment becomes more authentic when tactile feedback is added. At the most basic level, haptic feedback allows virtual representations of standard interfaces, such as buttons that can be pressed or rotary controls that produce a distinct clicking sensation.
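The idea of giving virtual buttons and rotary controls a physical feel can be sketched in a few lines. The `HapticDriver` class and its `pulse()` method below are hypothetical stand-ins for whatever SDK a particular glove or controller actually exposes; the amplitudes and durations are illustrative assumptions, not values from any real device.

```python
# Minimal sketch: mapping virtual-control events to tactile feedback.
# HapticDriver is a hypothetical stand-in for a real haptics SDK.
from dataclasses import dataclass


@dataclass
class HapticDriver:
    """Hypothetical device driver that plays short vibration pulses."""

    def pulse(self, amplitude: float, duration_ms: int) -> str:
        # A real driver would command an actuator; here we just
        # describe the pulse that would be played.
        return f"pulse(amp={amplitude}, dur={duration_ms}ms)"


def on_button_press(driver: HapticDriver) -> str:
    # A strong, short pulse approximates the "click" of a physical button.
    return driver.pulse(amplitude=0.8, duration_ms=20)


def on_dial_detent(driver: HapticDriver) -> str:
    # Weaker, even shorter pulses emulate the detents of a rotary control.
    return driver.pulse(amplitude=0.4, duration_ms=10)
```

The design point is simply that each class of virtual control gets its own recognizable pulse profile, so users can distinguish a button press from a dial detent by feel alone.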
A number of companies offer gloves as an interface. HaptX provides both types of feedback and includes motion tracking for VR applications and telerobotic operations. SenseGlove offers an advanced solution that lets users get a sense of the size, density and durability of virtual objects. Finger and hand sensations for manipulating and exploring objects in VR offer obvious advantages.
So it’s no surprise that Meta’s Reality Labs is experimenting with haptic gloves to understand their capabilities. Other companies are trying to achieve similar sensations without requiring users to put on a glove. Ultraleap is one of them, using ultrasound to project haptic sensations onto the hand.
These different approaches find use in different environments and for different use cases. Ultrasound can be more easily used in public spaces and for AR, where having to put on gloves can cause friction in the experience. Meanwhile, physical gloves can create more varied sensations and represent specific objects more accurately.
To create a complete sense of immersion, entire suits can extend haptics beyond the hands. Systems that deliver haptic sensations to the upper body or the whole body are welcome additions for entertainment, training, diagnostic and therapeutic applications. One example is bHaptics’ line of TactSuit products, essentially vests with haptic feedback points distributed across the torso. The company also offers haptic gloves.
Teslasuit, meanwhile, offers a set of haptic clothing: the Teslaglove and the Teslasuit, which consists of a jacket and pants. Here, haptic sensations are based on electrostimulation.
A more immersive virtual environment
Scientists are also experimenting with different approaches to creating haptic sensations. A group of researchers at the University of Chicago claims to have “identified five chemicals that can cause lasting tactile sensations: tingling (sanshool), numbing (lidocaine), stinging (cinnamaldehyde), heating (capsaicin), and cooling (menthol).” Their haptic devices include a sleeve that wraps around part of the forearm and a strip that can be applied to the user’s cheeks beneath a visual headset.
Perhaps attachable accessories for mass-market headsets – much like the band the University of Chicago researchers use in conjunction with headsets – could become a niche market that gives users specific sensations. Feelreal, which started as a Kickstarter campaign, is a shield-like device that attaches to the bottom of visual headsets. It can provide the sensation of a cool breeze, warmth, sprinkling water and a wide range of scents.
Meanwhile, OVR Technology offers devices that attach to VR headsets and provide a wide range of scents. The company cites use cases such as meditation and reaction training. Tapping into VR users’ sense of smell should come naturally to developers trying to create more immersive virtual environments. OVR CEO Aaron Wisniewski says, “A metaverse without scent would be like living life in black and white.”
Such accessories can provide a wide range of interaction options. A research team at Salzburg University of Applied Sciences has developed AirRes, an accessory for the Meta Quest 2 headset. The device, which resembles a gas mask, serves as a breathing interface: a resistance valve turns the wearer’s breathing into input, allowing interaction with virtual breathing tools – for example, blowing out birthday candles or using a blowgun to propel projectiles.
The resistance device can also restrict the user’s breathing, simulating entering a smoke-filled room, for example.
The applications in gaming and emergency training are obvious. Some of the applications presented by the researchers point to the device’s potential to lend virtual environments a new level of authenticity. A light breath can fog up a mirror and reveal hidden messages left behind, much like the messages children leave on windows in mystery games.
Such fogging of glass or metal surfaces creates a sense of realistic interaction with objects in the environment and offers the potential to leave messages for other players on virtual windows or metal surfaces – in games, for example.
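The breathing-interface mechanics described above can be outlined in a short sketch. Everything here is an illustrative assumption – the flow thresholds, the interaction names, and the idea of scaling valve resistance with scene smoke density are not the AirRes researchers’ actual values or API.

```python
# Hedged sketch of a breathing interface: classify exhalation strength
# into interactions, and scale breathing resistance with scene state.
# All thresholds and names are illustrative assumptions.

def classify_breath(flow_lpm: float) -> str:
    """Map an exhalation flow rate (liters per minute) to an interaction."""
    if flow_lpm < 5.0:
        return "idle"             # normal breathing, no interaction
    if flow_lpm < 20.0:
        return "fog_surface"      # gentle breath fogs nearby glass or metal
    return "blow_out_candles"     # forceful exhalation for stronger actions


def valve_resistance(smoke_density: float) -> float:
    """Scale breathing resistance (0 = fully open, 1 = fully closed)
    with the smoke density of the current scene, clamped to [0, 1]."""
    return max(0.0, min(1.0, smoke_density))
```

The same two-way pattern appears in the researchers’ demos: breath strength drives interactions in the scene, while scene state (such as smoke) drives the resistance the wearer feels.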
Meanwhile, H2L Technologies has developed a bracelet that can cause pain through small electric shocks. The company’s CEO, Emi Tamaki, says, “Feeling pain allows us to transform a metaverse world into a real world with increased feelings of presence and immersion.” Although the device can produce a sense of pain, its main purpose is to create sensations of resistance and weight when interacting with objects in VR.
Finally, brain interfaces could be the ultimate connection to virtual worlds and augmented landscapes. Such connections to the XR environment are currently speculative, but as time goes on and neuroscience advances, relatively simple applications will become conceivable, and companies are already exploring related technologies.
Meta Platforms’ Reality Labs is focusing on brain-computer interfaces (BCIs) for AR glasses, especially for communication applications. Others also see advantages in combining AR glasses with brain interfaces. In March 2022, Snap acquired NextMind, a BCI developer. Snap is likely exploring the use of such an interface with its AR smart glasses, Snap Spectacles. Potential uses include gaming and operating devices.
Meanwhile, Elon Musk’s Neuralink works more generally to develop interfaces that enable “direct connections between the brain and everyday technology.” In 2021, the company published information about a macaque monkey’s ability to move a cursor on a screen and play the video game Pong through brain activity alone.
Currently, many developments in XR and the metaverse are experimental, and interface technologies are part of that experimentation. Truly immersive environments will require advanced interfaces that can authentically replicate the real world, while clunky, difficult-to-use interfaces can prevent users from immersing themselves in VR at all.
Cost is another factor, and security will play an increasingly important role. Still, VR applications will drive the search for interfaces that enable more complex interactions with virtual objects and worlds – and advanced interfaces, in turn, will enable increasingly sophisticated XR applications.
Martin Schwirn is the author of Small Data, Big Disruptions: How to Spot Signals of Change and Manage Uncertainty (ISBN 9781632651921). He is also a senior advisor for strategic foresight at Business Finland, helping startups and established companies find their position in tomorrow’s market.