‘Describe the shirt’: An AI-based app enhances independence for blind users

[Photo: A smartphone taking a picture of a shirt resting on a bed. Shelly Brisbin / Texas Standard]
The new Virtual Volunteer software for the Be My Eyes app can help describe images to blind or visually impaired users.

Artificial intelligence tools like OpenAI’s ChatGPT tend to make people nervous about how they could affect jobs and education. But there’s at least one promising use for the technology: ChatGPT-based software from a company called Be My Eyes that has the potential to enhance independence for some people with disabilities.

Be My Eyes connects blind and visually impaired users with sighted volunteers who provide assistance with tasks ranging from spring cleaning to finding a gate at the airport. Volunteers view the blind user’s surroundings through a live video call on the user’s phone and offer guidance based on what they see.

The service, which began in 2015, is available in 150 countries and 180 languages. Be My Eyes’ new Virtual Volunteer software uses AI to interpret still photos and answer questions posed in a text chat. The software will be part of the existing Be My Eyes mobile app when it leaves beta testing mode, sometime in the next few months.

[Screenshot: The Be My Eyes app displays a photo of a shirt taken with a smartphone, with a field below the image where users can type a question for the Virtual Volunteer software. Shelly Brisbin / Texas Standard]
A screenshot from the Be My Eyes Virtual Volunteer software.

AI-based assistance is well suited to tasks that require straightforward description or reading text aloud. A user can take a photo of a food package and hear the cooking instructions spoken back. Show Virtual Volunteer a piece of clothing and ask for a description, and it will report the item’s color and style: “The shirt appears to be black, with a white pattern of small squares. The squares have lines and dots inside them.”

You can ask follow-up questions, too, making the Virtual Volunteer experience more interactive than some other AI-based tools. Take a photo of a room and ask the app where a lost object is. If the app doesn’t detect the object, it can ask you to take another photo, and you can refine your question to get a more specific response from the app.

Be My Eyes isn’t the first AI tool designed to assist blind people. Microsoft’s Seeing AI is a well-regarded free app that can identify colors, currency, the presence of light in a room, and even people. You can also use it to scan and read text aloud. What differentiates Be My Eyes’ offering is the ability to engage in a dialog with the user, and to provide richer interpretations of what it sees, using ChatGPT’s language model.

Be My Eyes CEO Mike Buckley says Virtual Volunteer can go beyond simply describing what it sees.

“One of my favorite examples is, you can take a picture of your refrigerator. It will tell you what the contents of the refrigerator are, and then you can say, ‘All right, what can I make for dinner, based on that?’”

So far, Virtual Volunteer is staying away from providing more opinion-based help. While the software will tell you about the color and design of a pair of pants and a shirt, it won’t offer advice about whether the pieces of clothing match, or look good together. For that, the app suggests tapping the button to contact a sighted volunteer for assistance.
