Be My Eyes is collaborating with OpenAI’s GPT-4 to improve accessibility for blind and low-vision people

Be My Eyes, the app for blind and low-vision people, announces a collaboration with OpenAI to develop a powerful new AI-powered Virtual Volunteer feature.

One of the things that excites me about tech is the ability to change people's lives for the better. Be My Eyes is a global community that connects blind and low-vision people with sighted volunteers through free live video calls to assist with daily tasks, from choosing clothing of a particular colour to reading a label or navigating an outdoor space.

Recently the company announced a collaboration with OpenAI to introduce a new AI-powered Virtual Volunteer feature. I spoke to Chairperson and CEO Michael Buckley to learn more.

Tech for good

Be My Eyes launched in January 2015; within 24 hours, the app had more than 10,000 users.

Fast forward to today: about half a million blind and low-vision people use the app, supported by an astonishing 6.3 million volunteers. It operates in 150 countries and 180 languages. The average call lasts only about three minutes, and there is a 90% success rate; most "failures" are due to telecom or tech issues.

But with that level of success, why embark on AI integration?

The company had been talking with OpenAI for a while about potential ways to work together. OpenAI shared news of a plan to launch a new product and invited Be My Eyes to demo it.

A five-week collaboration followed to develop an integration for beta testing, which provided an opportunity for even better services.

Community research by Be My Eyes revealed that members of the blind and low-vision community sometimes hesitate to make calls for three key reasons: 

  • They want to avoid taking a volunteer away from someone who might need them more.
  • They feel uncomfortable calling a stranger.
  • Making calls can make them feel less independent.

Buckley asserts:

"When we put the power of artificial intelligence through the virtual volunteer product into this community, it solves all of those barriers to accessibility and independence." 

Furthermore, the AI integration can do the same things as sighted volunteers, and more: beta testing of the Virtual Volunteer reveals a powerful ability to add a "tremendous layer of context."

For example, send it a photo of the inside of your fridge and it can tell you not only what's in there but also suggest five recipes for dinner based on those ingredients.
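To make that concrete, here is a minimal sketch of how an app might send a photo and a prompt to a vision-capable GPT-4 model using OpenAI's Python SDK. The model name, prompt wording and helper function are illustrative assumptions, not Be My Eyes' actual Virtual Volunteer integration.

# Illustrative sketch only, not Be My Eyes' actual integration.
# Assumes the official OpenAI Python SDK (pip install openai) and an
# OPENAI_API_KEY set in the environment.
import base64
from openai import OpenAI

client = OpenAI()

def describe_fridge_photo(image_path: str) -> str:
    """Send a fridge photo to a vision-capable GPT-4 model and ask for
    a description of its contents plus dinner suggestions."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4o",  # assumed vision-capable model name
        messages=[
            {
                "role": "user",
                "content": [
                    {"type": "text",
                     "text": "Describe what is in this fridge and suggest "
                             "five dinner recipes based on those ingredients."},
                    {"type": "image_url",
                     "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
                ],
            }
        ],
    )
    return response.choices[0].message.content

# Example usage:
# print(describe_fridge_photo("fridge.jpg"))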

[Embedded TikTok from @lucyedwards with @bemyeyesapp: "AD I'm blind, let me show you how AI is going to change your life…"]

Buckley recalled: 

"We took a picture of the railway system in India and asked, 'How do we get from Bangalore to Delhi?'

It told us specifically which lines had to be taken and how. There's not really another visual assistance product that can do anything like that. A volunteer may not have that depth of information."

Safeguarding users 

I was curious whether there might be instances where the AI is biased or gives an inaccurate answer, such as AI hallucinations, where a question elicits confident responses that do not seem to be justified by the model's training data.

The team at Be My Eyes repeatedly tried to break the product but was unsuccessful:

"We haven't seen a really bad example of it doing something dramatically wrong or unsafe."

However, there are still some minor errors to iron out. Buckley recounted using Be My Eyes to shop on Amazon, where the tool identified a page of toasters (sans text) as slow cookers.

There's also a built-in safeguard: if you ask a question about a photo and the AI doesn't know how to help, it asks, "Would you like to call a sighted volunteer for assistance?"

However, Buckley relies on feedback from the Be My Eyes community rather than simply waxing lyrical himself. He recalled:

"Half a dozen people have used the phrase' life changing'. A woman told me yesterday that this is helping her regain her independence. A gentleman late last night that I was on a WhatsApp thread said, 'For me, this is power I've never had before.' This gives me great, great hope for how this technology can empower people."

The company plans to gradually increase the number of beta testers over the coming weeks, hoping to roll the product out later this year. 

"We're incredibly excited about the possibilities, but also trying to be thoughtful, slow and measured and make sure that we iterate on this, with the direct impact of the blind and low vision community." 

Embedding accessibility into the world's biggest companies 

If done well, the Virtual Volunteer can potentially roll out even further. Be My Eyes works with enterprises like Microsoft, Procter & Gamble, Spotify, LinkedIn and Verizon in their customer call centres to make customer support more accessible for blind and low-vision people.

This provides a way to generate revenue and ensure that Be My Eyes will always be free to its users. 

OpenAI is supporting the company in offering the Virtual Volunteer free of charge.
