
It’s easy to see (pun intended) why the Apple Vision Pro’s Persona feature is in beta

… though the EyeSight feature is slightly better.

This is my Vision Pro Persona. Please note that I'm neither a serial killer nor a ghost.

The Apple Vision Pro’s Persona feature is usable, but Apple labels it a “beta,” and it’s easy to see why: the Personas it creates are kinda creepy.

Persona is basically a lifelike avatar. If you’re wearing a Vision Pro while on FaceTime, you’ll appear as your Persona, while others joining from a Mac, iPad, or iPhone will appear in a tile. 

Apple describes Persona as “an authentic spatial representation of an Apple Vision Pro user that enables others on a call to see their facial expressions and hand movements — all in real time.” Using machine learning techniques, a Persona can purportedly be created in just minutes using Vision Pro. Personas also work in third-party videoconferencing apps including Zoom, Cisco Webex, and Microsoft Teams.

You create a Persona by selecting an option in the settings menu and then removing the headset and following screens on the external display. It asks you to look up, look down, look left, look right, smile, smile with teeth, and close your eyes. Then, in seconds, it creates a 3D Persona.

Once that’s done and you launch a FaceTime call, you see a clear video of the person you’re calling on a screen in front of you. What they see is a 3D-rendered version of you. And that version is, in my experience, what you might look like as a zombie. 

Personas extend to the Vision Pro’s outer display, allowing others to view a digital recreation of your eyes. This feature is called EyeSight.

EyeSight, in Apple’s words, “helps users stay connected with those around them.”

Here’s Apple’s description of EyeSight: “When a person approaches someone wearing Vision Pro, the device feels transparent — letting the user see them while also displaying the user’s eyes. When a user is immersed in an environment or using an app, EyeSight gives visual cues to others about what the user is focused on.”

In other words, when I blink, my Persona blinks. When I look at an app, a bluish light shows where my attention is. However, as The Verge pointed out, the social cues of this thing are going to take a long while to sort out.

“… it’s strange to wear the headset and not actually know what’s happening on that front display — to not really have a sense of your appearance,” Victoria Song writes for The Verge. “And it’s even stranger that looking at people in the real world can cause them to appear, apparition-like, in the virtual world.”

However, the EyeSight display is so dim and the Vision Pro’s cover glass is so reflective that it’s hard to see in most normal to bright lighting. When people can see your eyes, it’s a low-res, ghostly image that feels like bad CGI in a movie. What’s more, visionOS doesn’t provide any indicator of exactly what other folks are seeing.

Dennis Sellers
Dennis Sellers is the editor/publisher of Apple World Today. He’s been an “Apple journalist” since 1995 (starting with the first big Apple news site, MacCentral). He loves to read, run, play sports, and watch movies.
