Strabismus and the Apple Vision Pro
Count me among the many people who returned their Apple Vision Pro before the initial return window expired, but I do think I have a unique take I haven’t seen floating around the Internet yet. I won’t rehash the big themes because there are plenty of great reviews out there that already covered how impressive the technology is (it’s super cool!) and also many that covered the shortcomings (for me, it’s that you can’t drink a beverage without a straw and that the Mac desktop connection is too blurry to be usable as a work monitor).
That said, I’ve had an eye condition called strabismus—an issue where your eyes are not aligned with each other when looking at an object—for all of my life and have had multiple surgeries to (unsuccessfully, so far!) try to fully correct it. Most of the time my eyes are fine, but my left one will start to drift off in a different direction if I’m physically or mentally tired. When this happens, I have no way of knowing; I really can’t tell unless I catch a glimpse of myself in the mirror or in a preview of my webcam video while I’m working on my computer. Unfortunately, there’s no alert to the brain that says anything like “Strabismus Mode: Activated!”
If I do notice, I can simply blink to refocus and bring both eyes back into alignment, but the problem is that I don’t know when to do that, so I was a little concerned about exactly what would happen with the eye tracking in Vision Pro. To be fair, Apple does warn that people with conditions like strabismus might not have a great time with Vision Pro:
Apple Vision Pro uses where a person looks to navigate within visionOS. Some medical conditions, such as those involving eyelid drooping, changes in eye alignment (including strabismus or lazy eye), or uncontrolled eye movements (including nystagmus) might make it difficult for Vision Pro to properly detect your eyes. This might impact the visual experience.
Still, I’m a sucker for technology, so I decided to roll the dice and placed my order anyway. I had also read about the Accessibility setting that forces the eye tracking to only use a single eye, which I figured would be a good fallback option if I ran into any issues.
To customize which eye you use to control Apple Vision Pro, go to Settings > Accessibility > Eye Input. You can choose Both Eyes, Left Eye Only, or Right Eye Only. Try this if you need to use one eye to control Apple Vision Pro input.
I really didn’t have any initial trouble, and what I found absolutely delightful about Apple Vision Pro was that—for the first time in my life!—I was suddenly getting real-time feedback about whether my left eye was aligned. Because the eye tracking requires both eyes to be looking at the exact same thing, I found that I would periodically be unable to get the cursor to settle on a specific visionOS UI element, and that would be a signal that my left eye was not in alignment with the right. Whenever this happened, I could quickly blink to refocus my eyes and then accurately settle on whatever I was trying to do.
What I thought was super cool was that I don’t have any equivalent feedback in the real world; my left eye will just appear to be looking in a slightly different direction after a bit, and it’s up to me to notice it in a mirror or read it in someone else’s reaction to me. But with Vision Pro, I could easily tell when this was happening. Now, let’s put up the usual disclaimers that I’m no doctor and I have no idea what kind of long-term side effects could come from prolonged use, but I thought it was worth calling out as a potential benefit for anyone else with strabismus.
One last speculative thought: I’ve had various apps on my iPhone and iPad over the years that include training exercises to help work your eye muscles towards better alignment. The way these work is that you wear those traditional 3D glasses with the blue and red lenses, and the app changes the color of the items on the screen so that you can only see half of the elements with each eye. For example, in a Tetris-like game the falling blocks might be red while the blocks you’ve already placed are blue. In order to place a red block in the intended location, you need both eyes to work together so you can see both the blue and the red components. I haven’t seen any examples yet, but I’m wondering if there’s potential to create new apps like this for the Vision Pro where visionOS natively applies the color shading. It’s still an expensive proposition for potential improvement, but over time this could be a huge help for people with mild cases of strabismus.
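The color-splitting mechanic those training apps rely on can be sketched in a few lines. This is purely my own illustration (the names and the simplified two-color filter model are hypothetical, not from any actual app): each lens passes only elements of its own color, so completing the game requires input from both eyes at once.

```python
# Simplified model of an anaglyph training game: the red lens shows
# only red elements and the blue lens only blue ones, so no single
# eye sees the whole board. (Hypothetical names for illustration.)

LENS_PASSES = {"red_lens": {"red"}, "blue_lens": {"blue"}}

def visible_to(eye_lens, elements):
    """Return the names of the colored elements this eye can see."""
    passed = LENS_PASSES[eye_lens]
    return [name for name, color in elements if color in passed]

# In the Tetris-like example: the falling block is red, the placed
# blocks are blue. Placing the piece needs both views combined.
elements = [("falling_block", "red"), ("placed_blocks", "blue")]
print(visible_to("red_lens", elements))   # ['falling_block']
print(visible_to("blue_lens", elements))  # ['placed_blocks']
```

The interesting twist with Vision Pro is that the headset already renders a separate image per eye, so a native app could split the scene directly instead of approximating it with colored lenses.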