Apple Vision Pro Questions

From desktop, to clamshell laptop, to tablet and phone handhelds, Apple is betting it can change, or at least extend, the form factor of personal computing again.

Questions I have:

How many monitors?

The Vision Pro can act as a 4K monitor for a Mac. It seems reasonable that it will use AirPlay 2, which would limit the system to a single 4K display. If you were hoping to replicate your multi-monitor setup in a virtual world, that doesn't seem likely.

I am curious how a single virtual monitor and a host of native apps will play together.
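I don't know how Apple has built this, but if native visionOS apps are declared as SwiftUI scenes the way iPad and Mac apps are, each app would get its own floating window in the shared space, independent of the single Mac virtual display. A minimal sketch under that assumption (the app name and view are hypothetical):

```swift
import SwiftUI

// Hypothetical native app: assuming visionOS apps declare SwiftUI scenes,
// the system would present this as its own window floating in the shared
// space, side by side with (but separate from) the Mac virtual display.
@main
struct NotesApp: App {
    var body: some Scene {
        WindowGroup {
            VStack(spacing: 12) {
                Text("Native window")
                    .font(.title)
                Text("Positioned independently of the single virtual monitor.")
            }
            .padding()
        }
    }
}
```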

Can I look at a monitor?

Suppose I have just one virtual monitor, but I also have my laptop with me. What will it be like to look through the Vision Pro at the laptop's screen and use the virtual monitor as a second display?

Are the cameras and “see through” display good enough to use a real monitor in the real world next to a virtual monitor?

Point and click, where are my hotkeys?

I hate mice. Point and click is great for simple use cases and for systems you don't use every day, but any power user relies on hotkeys for rapid control.

The Vision Pro seems to have some amazing eye-tracking-driven point and click. But how will that scale for the power user? This seems to continue the trend of "dumbing down" interaction models to the tablet's lowest common denominator.
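One thing that gives me a little hope: SwiftUI already has a keyboard-shortcut modifier on Apple's other platforms, so if visionOS forwards hardware keyboard input the way iPadOS does, native apps could still expose hotkeys to a paired Bluetooth keyboard. A rough sketch of that assumption (the view and save action are made up):

```swift
import SwiftUI

// Assumption: a paired hardware keyboard can drive SwiftUI keyboard
// shortcuts on Vision Pro the way it does on iPadOS and macOS.
struct EditorView: View {
    @State private var text = ""

    var body: some View {
        TextEditor(text: $text)
            .toolbar {
                // Cmd-S triggers the button with no look-and-pinch required.
                Button("Save") { save() }
                    .keyboardShortcut("s", modifiers: .command)
            }
    }

    private func save() {
        print("Saved \(text.count) characters") // placeholder action
    }
}
```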

How will typing feel?

For media consumption and more passive computing, this seems really immersive. But how close am I to Johnny Mnemonic? Can I tuck my laptop into a case and plug into a virtual UI that allows me to type, manipulate, and work?

FaceTime with digital puppets

In the FaceTime demo between two Vision Pro users, each participant is represented by an AI-powered digital puppet that simulates their face. From the short video, the simulated lip and facial movement seemed to sit in the uncanny valley. I wonder how off-putting this will be in practice.
