Could Apple’s Vision Pro Significantly Augment Sight For The Visually Impaired?

News Room

Close to one billion people around the world live with sight loss that cannot be corrected with regular glasses. Using electronic headsets that combine camera and display systems to improve their vision has been a long-held aspiration within the field of assistive technology.

Early variants of such technology have existed since the 1990s but, in the three decades that have followed, a market-leading high-end device with the requisite blend of rich, true-to-life optics and wearability for the user has failed to materialize.

Meaningfully replicating natural sight, particularly when the image then needs to be magnified, depends heavily on highly advanced cameras and displays, the likes of which have simply not been available to date in either bespoke low vision systems or consumer virtual reality and augmented reality headsets.

This could well be about to change when Apple’s Vision Pro headset, unveiled in June, officially launches in the U.S. in early 2024. Though the price point of the $3,499 mixed reality spatial computer doesn’t exactly scream “accessibility,” the tech specs are certainly tantalizing for anyone with an interest in camera vision as it relates to low vision rehabilitation.

The Vision Pro is chiefly marketed as a groundbreaking attempt to seamlessly blend physical and digital spaces, offering gigantic floating screens for work productivity or immersive entertainment, the capture and playback of 3D photos and videos, and virtual avatar replicas for work meetings or FaceTime calls.

To achieve all this, the Cupertino-based tech giant will use dual micro-OLED displays with a combined 23 million pixels and an unprecedented 12 on-board cameras, with passthrough images of the real world streaming over one billion color pixels per second, alongside advanced eye tracking capabilities. Just as with iPhones and iPads, the new visionOS will have its own extensive app store, enabling specialist developers to take advantage of the Vision Pro’s game-changing hardware.

Privacy concerns

Sadly, as ever in life, if something seems too good to be true, it usually is, and as of summer 2023 it would appear there is a fly in the ointment. Apple has confirmed that, at launch, it will restrict third-party app developers’ access to the Vision Pro’s impressive camera array.

This doesn’t mean the device won’t be accessible to people with sight loss. On the contrary, iOS has an excellent track record of offering advanced accessibility features such as zooming, text-to-speech and voice commands straight out of the box. However, these features apply only to a device’s internal menus, systems and some apps.
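
To illustrate what “out of the box” looks like from a developer’s perspective, the short Swift sketch below shows one way an iOS app might lean on those system features: handing a message to the user’s screen reader if it is running, and otherwise falling back to the built-in speech synthesizer. The helper class and the example messages are hypothetical illustrations, not anything Apple or this article describes.

```swift
import UIKit
import AVFoundation

// Hypothetical helper (for illustration only) showing how an app can build on
// the accessibility features iOS ships out of the box: if VoiceOver is
// running, hand the message to the user's screen reader; otherwise speak it
// with the system speech synthesizer.
final class AccessibilityAnnouncer {
    private let synthesizer = AVSpeechSynthesizer()

    func announce(_ message: String) {
        if UIAccessibility.isVoiceOverRunning {
            // Respect the user's own VoiceOver voice, rate and verbosity settings.
            UIAccessibility.post(notification: .announcement, argument: message)
        } else {
            // Fall back to direct text-to-speech via the system synthesizer.
            synthesizer.speak(AVSpeechUtterance(string: message))
        }
    }
}

// Hypothetical usage:
// let announcer = AccessibilityAnnouncer()
// announcer.announce("Camera view is now magnified four times.")
```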

Realistically, though, sight enhancement wearables aren’t primarily intended for interpreting textual information. They are used, above all, as a hands-free way to enhance real-world scenes and experiences: attending mass spectator events, enjoying nature and physical environments, or engaging in activities with friends and family.

The reason behind the camera restrictions is a simple but important one – namely privacy. Apple is keen to see how the early adoption phase of its headset goes and how the public takes to it before considering opening up its full capabilities to third-party developers.

It speaks to the new age we live in where advanced technologies such as AI and camera vision hold great promise for transforming the lives of marginalized populations but, at the same time, wider society hasn’t quite made its mind up about the use or potential abuse of technologically-bestowed superpowers.

Being able to zoom in on or even record a wrinkle on the face of somebody standing across the street is one such superpower. Nonetheless, it’s worth noting that the very same technology would enable a visually impaired parent to see the expression on their child’s face when they attend their first school play.

The other oddity here is that, to a large extent, members of the public have been routinely filming one another for well over a decade thanks to the proliferation of smartphone cameras. The stumbling block, then, appears to be acclimatizing to the etiquette of a new technological paradigm rather than taking a wholesale leap of faith.

Doing it for the users

One individual who has seen firsthand how smartphone cameras have revolutionized low vision rehabilitation in recent years is Rebecca Rosenberg, CEO of ReBokeh, who lives with albinism, a condition that affects her sight. ReBokeh is an iOS app that boosts the magnification of the iPhone’s camera and gives the user novel ways of zooming and applying other sight enhancement filters.
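
As a rough illustration of what such sight enhancement filters involve under the hood, here is a minimal Core Image sketch in Swift, not ReBokeh’s actual implementation, that applies digital magnification plus a contrast and brightness boost to a single camera frame. The struct name, parameters and default values are illustrative assumptions.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Minimal sketch (not ReBokeh's actual implementation): one way a low vision
// app could post-process a live camera frame, combining digital magnification
// with a contrast and brightness boost. Default values are illustrative only.
struct LowVisionFrameProcessor {
    func enhance(_ frame: CIImage,
                 magnification: CGFloat = 2.0,
                 contrast: Float = 1.4,
                 brightness: Float = 0.05) -> CIImage {
        // Digital zoom: scale the frame up around its center, then crop back
        // to the original extent so the output keeps the same size.
        let extent = frame.extent
        let center = CGPoint(x: extent.midX, y: extent.midY)
        let zoom = CGAffineTransform(translationX: -center.x, y: -center.y)
            .concatenating(CGAffineTransform(scaleX: magnification, y: magnification))
            .concatenating(CGAffineTransform(translationX: center.x, y: center.y))
        let magnified = frame.transformed(by: zoom).cropped(to: extent)

        // Contrast and brightness adjustment to make edges and text easier to see.
        let filter = CIFilter.colorControls()
        filter.inputImage = magnified
        filter.contrast = contrast
        filter.brightness = brightness
        return filter.outputImage ?? magnified
    }
}
```

In a real app, a processor like this would sit behind a live camera capture session feeding frames to the display, with the magnification and filter values bound to on-screen controls the user can adjust.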

Rosenberg says she hopes Apple doesn’t take too long to open up its new eyewear so its full potential can be realized for everyone.

“In the case of Apple locking down a lot of the functionality of the Vision Pro, that would be a bad outcome, not just for people with low vision, but for everyone. Because if they are locking down the cameras – what else might they be locking down? What else can’t you access?”

She continues, “With the Vision Pro, Apple has created a technology with the potential to be fantastic for people with low vision. Hopefully, Apple will not forget about the additional marginalized populations that they could serve with this device. There’s so much potential here for people who’ve been waiting decades for something like this.”

Dr. Bryan Wolynski is Chief Technology Officer at the Lighthouse Guild and has over two decades of experience as an optometrist specializing in low vision rehabilitation using assistive technology. Wolynski feels that, though Apple currently sits at the fulcrum of a new and innovative technology, it’s going to take a joint effort to realize its potential for low vision rehabilitation:

“At this point, we don’t know exactly how all those cameras or even the augmented reality overlay would work for somebody with low vision but we’re hopeful,” says Wolynski.

“The key will be to bring everyone together in the same space – not just engineers, programmers and coders imagining what people with vision loss might want to see. Perhaps Apple could bring in visually impaired coders as well as doctors and researchers but, most importantly, users living with sight loss to really figure this out.”

With the Vision Pro not set to launch for a few months yet, and with people with low vision recognizing its novelty, Apple will surely be granted a grace period for the dust to settle on the new device and for any early teething problems to be ironed out.

However, even now, the low vision community has its fingers crossed that, amid the hype and media clamor, both Apple and the Vision Pro’s app developers are sufficiently aware of the device’s potential to augment real-world experiences for people with sight loss, experiences that are fundamentally about much more than novel ways of interacting with a computer.
