We spoke to co-creator Alberto Rizzoli.

Restoring independence to the visually impaired? There’s an app for that.

Aipoly is a smartphone app that pairs machine vision with artificial intelligence to audibly describe whatever the smartphone’s camera “sees.” 

The app was developed by Marita Cheng and Alberto Rizzoli, technologists who collaborated at Singularity University to create something useful for the 285 million visually impaired people around the world. According to Cheng and Rizzoli, two-thirds of this population will become smartphone users in the next five years.

Inspiration struck when they attended a presentation by IBM Watson Group CTO Robert High. High demonstrated some of the celebrity supercomputer’s capabilities—show it a picture and it can provide a semantic, conversational description of what’s happening in it. “We started looking into technologies to recognize images,” Rizzoli told the Daily Dot. “We learned about neural networks and integrated this into an application. It’s the simplest possible process for a user to identify an image: press a button, receive an audio description.”

The Aipoly software works by dividing an image into sections and running reverse image searches on them. It identifies the nouns in a picture—“car,” “battery,” “dog”—as well as the adjectives, like “silver” or “shiny.” Then artificial intelligence steps up to the plate to turn the computer’s understanding of the image into something for a human to digest. Audio playback might tell a visually impaired user that they are looking at “a shiny, silver car.”
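In rough pseudocode terms, that pipeline could be sketched as follows. This is only an illustrative toy, not Aipoly's actual implementation: the section-splitting, the stub classifier, and the label table are all invented here, and a real app would query a trained neural network instead.

```python
# Toy sketch of an Aipoly-style pipeline: split the image into sections,
# label each section with nouns and adjectives, compose one description.
# All names and labels here are hypothetical, not from Aipoly's code.

def split_into_sections(image, rows, cols):
    """Divide a 2-D grid of pixel values into rows*cols rectangular sections."""
    h, w = len(image), len(image[0])
    sh, sw = h // rows, w // cols
    return [
        [row[c * sw:(c + 1) * sw] for row in image[r * sh:(r + 1) * sh]]
        for r in range(rows)
        for c in range(cols)
    ]

def classify_section(section):
    """Stand-in for a neural-network classifier: returns (nouns, adjectives).

    Faked here by mapping the dominant pixel value to a canned label.
    """
    LABELS = {
        1: (["car"], ["silver", "shiny"]),
        2: (["dog"], ["brown"]),
    }
    flat = [p for row in section for p in row]
    dominant = max(set(flat), key=flat.count)
    return LABELS.get(dominant, ([], []))

def describe(image, rows=2, cols=2):
    """Merge per-section labels into one audio-ready sentence."""
    nouns, adjectives = [], []
    for section in split_into_sections(image, rows, cols):
        n, a = classify_section(section)
        nouns.extend(x for x in n if x not in nouns)
        adjectives.extend(x for x in a if x not in adjectives)
    if not nouns:
        return "nothing recognized"
    parts = ["a"]
    if adjectives:
        parts.append(", ".join(adjectives))
    parts.append(" and ".join(nouns))
    return " ".join(parts)

# A toy 4x4 "image" whose pixels all map to the "car" label:
image = [[1, 1, 1, 1]] * 4
print(describe(image))  # -> a silver, shiny car
```

The resulting string would then be handed to the phone's text-to-speech engine for audio playback.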

The demo video shows the app in action:

This is still an experimental technology. Once perfected, a visually impaired individual might be able to use this app to recognize what’s on a plate of food or to take pictures of their children to identify how they’re dressed. Rizzoli told us about one user who was passionate about cars, so they walked around a parking lot together until they successfully identified a Tesla using the app.

For now, some human help takes place behind the scenes so that Aipoly can accurately identify images, but Rizzoli tells us it will soon be 100 percent software-based.

He has big ambitions for the future as well, and imagines using Aipoly to create something of a Google Street View for the blind. “We can build a virtual model of the world so that users don’t have to keep scanning their environment,” he said. “The info is already there, and Aipoly would one day provide them with real-time feedback.”

Rizzoli is proud of the autonomy that the app might afford to those with vision impairments. “It makes the visually impaired more independent, and it enables them to explore the world.”

Photo via Aipoly
