Apple’s headset could redefine what being in the metaverse means

The release of Apple’s new mixed-reality headset, Vision Pro, could cause a seismic shift in how users experience the metaverse, with developers potentially moving away from the absolute isolation of virtual reality.

Unlike today’s virtual reality headsets, which center on full immersion, Apple’s Vision Pro — unveiled on June 5 — can also superimpose applications onto the real world, letting users “interact with digital content in a way that feels like it is physically present in their space.”

Speaking to Cointelegraph, KPMG Head of Metaverse Alyse Su said she believes the Vision Pro will shift developer focus away from purely immersive virtual worlds.

The headset introduces a new feature Apple calls “EyeSight,” which uses lens trickery to make the user’s facial expressions look natural to people nearby. EyeSight also lets the front display switch between a transparent and an opaque appearance, depending on whether the user is immersed in content or interacting with people in the real world.

“With the traditional or other headsets, there’s this barrier between people who are wearing it and people who aren’t. It feels like you’re in two different worlds,” she said. “Now there’s very few barriers between people, so you can have relatively seamless interactions.”

Su said there is also a lot of potential in its eye-tracking technology, which can be used to help create personalized experiences.

Apple’s pupil-tracking technology works by detecting the mental state of users based on data from their eye movements and the response of their pupils to stimuli. It then uses artificial intelligence to make predictions about their emotions.
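
Apple has not disclosed how this pipeline works internally, but the idea Su describes can be illustrated with a toy sketch: summarize a short window of pupil-diameter readings into a few features, then feed them to a simple classifier that estimates an affective state. Everything below — the feature choices, the “engagement” label and the synthetic data — is hypothetical and purely illustrative.

```python
# Hypothetical illustration only: Apple has not published its pupil-analysis pipeline.
# Toy example of inferring an "engagement" signal from pupil-diameter samples.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def pupil_features(diameter_mm: np.ndarray) -> np.ndarray:
    """Summarize a short window of pupil-diameter samples (in millimeters)."""
    return np.array([
        diameter_mm.mean(),          # overall dilation level
        diameter_mm.std(),           # variability within the window
        np.diff(diameter_mm).max(),  # sharpest dilation event in the window
    ])

# Synthetic training data: "engaged" windows dilate slightly more on average.
calm = rng.normal(3.0, 0.10, size=(200, 60))
engaged = rng.normal(3.4, 0.15, size=(200, 60))
X = np.array([pupil_features(w) for w in np.vstack([calm, engaged])])
y = np.array([0] * 200 + [1] * 200)

model = LogisticRegression().fit(X, y)

# Score a new window of eye-tracker samples.
window = rng.normal(3.35, 0.15, size=60)
p_engaged = model.predict_proba(pupil_features(window).reshape(1, -1))[0, 1]
print(f"Estimated probability the wearer is engaged: {p_engaged:.2f}")
```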

“They’ve incorporated a lot of neuroscience or neuro tech research into this headset. The most overlooked part is the predictive pupil dilation tracking technology, which is based on their years of neurological research,” said Su.

Su predicted that the Vision Pro will steer developers towards utilizing “emerging fields such as neuroscience and generative AI to create more personalized and predictive experiences.”

Peter Xing, the founder of blockchain-based project Transhuman Coin, also praised the headset’s design for “integrating with the natural way we interact as humans” and pointed to its unique eye-tracking capabilities as one of the bigger leaps forward for the metaverse.

“By detecting pupil dilations, the headset is acting as a proto-brain-computer interface to pick up when a user expects something to be selected to pre-empt what they’re thinking.”
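
The “pre-emptive selection” behavior Xing describes can be sketched as a simple heuristic: combine how long the gaze has dwelled on a UI element with a spike in pupil dilation, and pre-highlight that element before the user confirms. The data structure, function names and thresholds below are assumptions made for illustration, not Apple’s actual logic.

```python
# Hypothetical sketch of pre-emptive selection from gaze dwell + pupil dilation.
# Thresholds and field names are illustrative, not taken from visionOS.
from dataclasses import dataclass

@dataclass
class GazeSample:
    target_id: str         # UI element currently under the user's gaze
    dwell_ms: float        # how long the gaze has rested on that element
    pupil_delta_mm: float  # dilation relative to a rolling baseline

def predict_selection(sample: GazeSample,
                      dwell_threshold_ms: float = 300.0,
                      dilation_threshold_mm: float = 0.15) -> bool:
    """Return True if the system should pre-highlight the gazed element."""
    return (sample.dwell_ms >= dwell_threshold_ms
            and sample.pupil_delta_mm >= dilation_threshold_mm)

if __name__ == "__main__":
    s = GazeSample(target_id="open_photos_button",
                   dwell_ms=420.0, pupil_delta_mm=0.22)
    if predict_selection(s):
        print(f"Pre-highlighting {s.target_id} before the tap gesture lands")
```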

Related: Animoca still bullish on blockchain games, awaits license for metaverse fund

When asked whether the Vision Pro could put a spring back into the step of a struggling metaverse industry, in which the native tokens of almost all blockchain-based virtual worlds have lost more than 90% of their value, Xing wasn’t overly hopeful, at least not in the short term.

He explained that it’s highly unlikely that Apple would encourage decentralized approaches that could threaten its “lucrative walled garden.”

While he, and many others, noted the distinct lack of a gaming focus in the product release, Xing believes Apple’s recent partnership with Disney and Marvel could see a wellspring of games and other interactive experiences brought into the fold.

Xing believes this is exactly what the metaverse needs to go from the “gamer-centric world” to the mainstream.

