The ChatGPT-powered app that helps blind people ‘see’ the world around them

You could argue that we’re still in the heady early days of an inevitable proliferation of ChatGPT apps, plug-ins and tools to automate pesky daily tasks.

But some bright minds are already finding interesting uses for OpenAI’s revolutionary artificial intelligence (AI) chatbot, aside from plagiarising essays, sending flowery emails and asking if God exists.

Take, for example, an app for visually impaired and blind people called Be My Eyes.

Founder Hans Jørgen Wiberg, who is visually impaired himself, came up with the idea in 2012 as a way to get visual assistance when he was on his own. The app is now drawing on the technology behind ChatGPT.

How does it work?

Launched in 2015, the app works by pairing visually impaired or blind people with sighted volunteers who act as their “eyes” via video call.

Now boasting half a million blind and low-vision users alongside 6.4 million sighted volunteers, the app is trialling a “Virtual Volunteer” powered by the technology behind ChatGPT.

This means that users can take a picture with their phone, send it via the app and instantly have the image described to them in impressive detail.

The virtual aid can even suggest recipes to prepare based on images of the contents of a user’s fridge.

"I use it on a fairly regular basis, maybe once or twice a week,” Jesper Holten, a Be My Eyes user who is completely blind, said.

“It tends to be in daily situations, like when I'm cooking food and want to have a look at tins. Especially with tins and cans of food stuff, it can be very hard to ascertain what's in them.”

Using ChatGPT for live video streaming

Wiberg says Be My Eyes was approached by OpenAI in early February to help launch the start-up’s new image-to-text generator.

"I have really had a hard time sleeping since this because I think there are so many possibilities where this could go,” he said.

One of the tantalising possibilities keeping him up at night is the integration of ChatGPT into live video streaming, which could open up a world of accessibility to people with poor vision.

Holten, in particular, is excited by the prospect of being able to explore new places independently.

"I want to have a level of confidence that I don't necessarily have in unfamiliar spaces, that if the AI technology can help me in gaining or regaining that level of confidence, that would really be something,” he said.

For now, the Virtual Volunteer is still in closed beta testing and is not available for widespread use.

Wiberg is also adamant that they’re not planning to ditch their sighted volunteers any time soon.

But developers are already excited by the possibilities.

"I think there are a lot of use cases where you may feel like you're burdening someone, for something that seems not important,” said Jesper Hvirring Henriksen, Be My Eyes’ CTO.

“There's lots of important use cases and maybe I shouldn't take up the time of a volunteer with what seems like a silly little thing. Or maybe it's just Monday morning, it's early in the morning. You don't feel like talking to a human being right now. In all those cases, you can just use AI now, and you're talking to a computer," he added.
