The metaverse promises augmented and virtual realities, but what about the intricacies of the physical world?
Sensory tech, from haptics and electrostimulation to 8K vision and spatial audio, promises to entice all five senses into the metaverse.
Sensory experiences will provide a greater sense of immersion in virtual reality, improving customer experiences and business outcomes.
Virtual reality has been promising immersive experiences for decades, but it’s only recently, with cloud, 5G and a host of other key converging technologies, that it has come close to delivering on its initial hype.
The metaverse will be a natural extension of VR’s success: a layering onto, and expansion of, our current reality (that’s the ‘meta’) in the universe.
The metaverse will not take place entirely in a VR headset – though that’s a common misconception. Instead, our lives will become phygital – a merge of the digital and the physical. And that physicality, those sensory experiences, could bring a whole new level of reality to the metaverse.
At the moment, experiencing all five main senses – sight, touch, sound, taste and smell – is firmly a physical world phenomenon. But a range of innovative technology aims to change that; here are a few examples of what’s to come.
With a focus on both good content and good graphics, ‘seeing’ AR and VR experiences is already better than ever before.* In the future, graphics will continue to improve, with 5G/6G and cloud enabling 8K or even 16K photorealism and wide fields of view as standard in VR headsets – without the need for a beefy computer to power them.1 Additional cameras promise to capture user expressions and translate them through to avatars, increasing communication between users.2
Increased interactivity with the physical world, such as colour ‘passthroughs’ (allowing you to see and interact with your room beyond the headset) will allow better integration, and decrease the feeling of two separate realities. Pancake lenses and eye-tracking will potentially make VR headsets smaller, lighter, and easier to navigate.3
Augmented reality glasses are being developed by tech players such as Apple, Meta (and Ray-Ban), Snap and Google (Google Glass is still being manufactured after a pivot to enterprise use-cases). Augmented reality contact lenses exist.4 And new technology is promising interfaces such as micro LEDs, more comfortable (smaller) headsets, better battery life and, so the forecast goes, a replacement for smartphones.5
Touch adds a critical dimension to making the virtual feel grounded, and it can have a direct impact on business outcomes such as brand affinity and purchase intention.6
Haptic vibration is standard in VR controllers, but it is increasingly also being used in wearables, such as VR gloves and suits. While motor-driven vibration is obviously not the same as physically touching an object, even this small amount of feedback can increase the feeling of an item being ‘real’.
Another type of touch simulation, force feedback, combines soft robotics and actuators, or pads, that inflate and deflate to create pressure. For example, in a haptic glove, actuators placed along the fingers and palm inflate and deflate in line with the contours of a virtual object. In this way, a user picking up a virtual aluminium can could ‘feel’ its cylindrical shape, and appropriate resistance if they were to attempt to crush it. The addition of auditory feedback, such as crunching metal, could further enhance the illusion.7
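The resistance-then-crush behaviour described above can be sketched as a toy pressure model – a hypothetical illustration only, not how any particular haptic glove is implemented: actuator pressure rises with how far the fingertip presses into the virtual surface, until the ‘can’ gives way.

```python
def can_resistance(finger_depth_mm, crush_threshold_mm=3.0, max_pressure=1.0):
    """Toy force-feedback model for a virtual aluminium can.

    Actuator pressure (0.0–1.0) rises linearly with how far the
    fingertip presses into the can's surface; past the crush
    threshold the can buckles and resistance drops away.
    All parameter names and values here are illustrative assumptions.
    """
    if finger_depth_mm <= 0:
        return 0.0                  # no contact: actuators fully deflated
    if finger_depth_mm > crush_threshold_mm:
        return 0.2 * max_pressure   # can crushed: only residual resistance
    return min(max_pressure, finger_depth_mm / crush_threshold_mm)
```

A glove runtime would sample finger positions every frame and drive each actuator with a value like this – light contact produces gentle pressure, and pressing hard enough makes the simulated can collapse.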
Electrostimulation, where electrical signals of varying amplitude, voltage and frequency are sent to electrodes within wearables, or on a user’s skin, is being used to trigger muscle and/or nerve responses.8 This can produce a number of very real sensations, such as temperature, pain, grip, lift, pushing or even wind blowing against skin. In VR gaming, this can mimic being shot, stabbed or even the sensation of bleeding – adding a whole other level of reality to play.9
And if wearing a second skin doesn’t sound like something you’re interested in, ultrasound technology has also been used to create object resistance in mid-air – such as braille characters that users can read (feel) with their bare fingers.10,11
Think hearing-tech is just noise-cancellation and wireless earbuds? Not so! Bone-conduction speakers, for example, are already on the market in smart sunglasses such as the aforementioned Ray-Bans, and in products from companies such as Bose and Amazon. Bone conduction works by transmitting sound vibration to the inner ear through the skull. Unlike traditional earphones, which plug up your outer ear, bone conduction lets you still hear life around you and be immersed in both worlds at the same time.**
What about being able to communicate without speaking? MIT’s ‘AlterEgo’ prototype reads signals from the mouth’s muscles as a person forms soundless words and replies back via bone conduction in the wearable.12 In 2019, Microsoft patented an idea that would allow users to speak to their voice assistant in public through inhaled whispers.13 And there are a range of research projects currently underway into neural implants that could turn thought directly into speech, bypassing the mouth altogether.14
Spatial audio, or sound that is oriented in 360-degree space, allows users to hear in a more realistic manner than stereo or mono. Here, sounds come from a variety of directions and with additional information – such as proximity (volume), environment (muffled, echoey) and context (background noise). Combined with AI, spatial audio could create virtual environments where whispering to someone standing next to you would mean only they hear you, or where you could walk between groups in a virtual space and hear clearly only the chatter of those nearest.15,16
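The proximity cue above – a whisper audible only to those standing nearby – can be sketched with a simple inverse-distance attenuation model. This is a minimal illustration under assumed parameters; real spatial-audio engines also model direction, occlusion and reverb.

```python
import math

def spatial_gain(listener, source, reference_dist=1.0, cutoff_dist=20.0):
    """Attenuate a sound source by its distance from the listener.

    Inverse-distance law: full volume within the reference distance,
    silence beyond the cutoff, so a whisper reaches only nearby avatars.
    Positions are (x, y) pairs; distances are illustrative assumptions.
    """
    dist = math.hypot(source[0] - listener[0], source[1] - listener[1])
    if dist <= reference_dist:
        return 1.0          # at arm's length: full volume
    if dist >= cutoff_dist:
        return 0.0          # across the room: inaudible
    return reference_dist / dist
```

Applying this gain per listener is what makes the same utterance a clear whisper to a neighbour and silence to everyone else – the ‘walking between groups’ effect falls out of the same curve.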
Sadly, technology is still a long way from a virtual taste experience where, Willy Wonka style, you pop a piece of gum in your mouth and taste an entire meal, but that doesn’t mean there won’t be metaverse-related implications for the food industry.
In an augmented rather than fully virtual sense, NFTs are already being used to change the physical dining experience, from reservation systems and exclusive restaurant access to dynamic pricing of tables, meals or even limited ingredients (who wants the fugu?).17 Burger King has used QR codes (and celebrities) to unlock collectible NFTs.18 McDonald’s is developing virtual restaurants for people to socialise in while ordering physical deliveries.19
Through a health and wellness lens, AR allows diners to peruse 3D models of their meals and see ingredient lists. Blockchain is being used in supply chains to track sustainability, produce, food safety and provenance, and marketing those credentials to the end user is a potential value-add. Health campaigns have been gamified in conjunction with supermarkets and popular online games.20 And if it really floats your boat, a lickable TV screen has been invented. No, really.21
In the physical world, web3 companies are already experimenting with NFTs, crowd-sourcing and fragrance.22 But what about in the virtual world – how do you create particular scents from thin air?
Companies are investigating. Snap-on cartridges, wearable emitters and stand-alone generators that combine molecules to release scents into an environment all exist in various stages of development.23,24 While they have the potential to add dimension to media experiences, there are also use-cases in health, such as disease detection, and in therapeutics, where scent can increase the effectiveness of mindfulness or VR-assisted pain relief.25
In consumer markets too, olfactory cues could be advantageous. Imagine entering a virtual store scented with the same ‘brand scent’ as the corresponding physical store, feelings of loyalty intact. And what of associated product scents – fresh rubber in a shoe shop, new-car smell in a virtual second-hand dealership, or musty paper in a bookshop – could they increase purchase intent?26 Perhaps the waft of a signature dish from a virtual fast food joint causes an uptick in online orders? Avatar-worn cologne?
In isolation, sense technologies can seem trivial, an attempt to make the virtual more ‘real’, enticing humans to spend time (and money) in digital spaces. However, much like the complexities of how people relate to sense in the offline world, the use-cases for its replication in a virtual one should not be underestimated.
Experience continues to be a defining factor for brands, a differentiator that strengthens emotional attachment and customer loyalty. Gaming, media, sports, and online events will all benefit from being more emotive and immersive. For industry, the addition of sense to AR and VR could be game-changing: from better training for high-risk roles such as defence, firefighting or aviation, to AR and haptics ensuring machine maintenance and factory-floor safety, to improving patient outcomes in remote surgery.
In short, you might be stopping to smell the virtual roses sooner than you think.
* One last barrier to overcome? Vestibular sense and motion sickness – and help is on the way, for instance in the form of eye-side light panels.27
** If you’ve ever wanted to place a finger to your ear and talk spy-style without an earbud, smart rings can do it with bone-conduction tech.28
With thanks to Asanga Lokusooriya.
References
© 2017 - 2024 PwC. All rights reserved. PwC refers to the PwC network and/or one or more of its member firms, each of which is a separate legal entity. Please see www.pwc.com/structure for further details. Liability limited by a scheme approved under Professional Standards Legislation.