
Apple’s Vision Pro to Transform the Metaverse

The unveiling of Apple’s Vision Pro mixed-reality headset on June 5 could bring about a major change in how users experience the metaverse, with developers potentially moving away from the full isolation of virtual reality.

Unlike current virtual reality headsets, the Vision Pro offers the ability to superimpose apps onto the real world, allowing users to “interact with digital content in a way that feels like it is physically present in their space.”

Alyse Su, Head of Metaverse at KPMG, told Cointelegraph that the Vision Pro could cause developers to shift their focus away from purely immersive virtual worlds. The headset features a new technology called “EyeSight,” which uses lens trickery to make the user’s facial expressions look natural to others. EyeSight also enables the display to switch between a transparent and opaque view, depending on whether a user is consuming immersive content or interacting with people in the real world.

AI Technology in Apple’s Headset

“With the traditional or other headsets, there is a barrier between people who are wearing it and those who are not, as if they are in two different worlds,” said Su. “But with Apple’s headset, there are very few barriers between people, making for relatively seamless interactions.”

Su also pointed out the potential of the eye-tracking technology, which can be used to create personalized experiences. Apple’s pupil-tracking technology draws on data from eye movements and pupil responses to stimuli, and uses AI to predict the user’s emotions.

“There has been a lot of research into neuro tech incorporated into this headset,” said Su. “The most overlooked part is the predictive pupil dilation tracking technology, which is based on their years of neurological research.”

The use of AI in headsets is just one example of how AI tools are being applied to build out a Web3 metaverse, as seen in applications like Snapchat AI and SoundHound AI.

The Vision Pro and AI-Powered Metaverse

Su predicted that the Vision Pro will steer developers towards utilizing “emerging fields such as neuroscience and generative AI to create more personalized and predictive experiences.” Peter Xing, the founder of blockchain-based project Transhuman Coin, also praised the headset’s design for “integrating with the natural way we interact as humans” and pointed to its unique eye-tracking capabilities as one of the bigger leaps forward for the AI-powered metaverse, or Web 3.0.

When asked whether the Vision Pro could put a spring back into the step of a struggling metaverse industry (one in which nearly all blockchain-based virtual worlds have seen their native tokens lose more than 90% of their value), Xing wasn’t overly hopeful, at least not in the short term. He explained that it’s highly unlikely Apple would encourage decentralized approaches that could threaten its “lucrative walled garden.”

AI and the Metaverse

Xing believes that Apple’s recent partnership with Disney and Marvel could be the catalyst for a surge in gaming and interactive experiences, helping the metaverse move beyond its gamer-centric world and into the mainstream.

To achieve this, AI tools such as Fetch.ai, Snapchat AI and SoundHound AI could be used in combination with AI-based travel booking, crypto plugins and other novel applications of ChatGPT.

These tools, when used properly, could help the metaverse become more accessible to all, and not just gamers.
