A tiny chip could be the powerful, driving force behind new technologies that let the visually impaired navigate the world.
Researchers at the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory developed a chip and navigation system, or virtual “guide dog,” that’s worn around the neck. It uses a Texas Instruments 3D-video camera and algorithms to analyze the space in front of the wearer and describe what the camera sees. The system relays that information through a handheld braille interface, alerting the wearer when an obstacle lies directly ahead.
Dongsuk Jeon, who was a postdoc at MIT’s Microsystems Research Laboratories when this technology was developed, explained that the handheld device can determine a safe distance to walk in different directions, and it uses Bluetooth to send the data from the computer to the device. It has five columns with four balls in each column that react to objects in front of the wearer.
“For instance, if there is nothing in front then all the pins are down,” Jeon said in an email to the Daily Dot. “If some obstacle appears in one direction from far away, only the top pin in that column will go up. As that obstacle gets closer, more pins will go up in that column towards the bottom.”
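Jeon’s description of the pin columns maps naturally to a simple distance-to-pins rule. The sketch below is purely illustrative, assuming a hypothetical 4-meter sensing range and evenly spaced distance bands; the actual thresholds MIT uses aren’t specified in the article.

```python
# Illustrative sketch of the braille-style feedback Jeon describes: five
# columns of four pins, where a closer obstacle raises more pins in the
# column for its direction. MAX_RANGE_M and the band spacing are assumed
# values for illustration, not MIT's actual parameters.

MAX_RANGE_M = 4.0   # assumed sensing range: beyond this, all pins stay down
NUM_PINS = 4        # pins per column

def pins_for_distance(distance_m):
    """Return how many pins (0-4) rise for an obstacle at distance_m."""
    if distance_m >= MAX_RANGE_M:
        return 0  # "if there is nothing in front then all the pins are down"
    # Closer obstacle -> more pins rise, starting from the top of the column.
    fraction = 1.0 - distance_m / MAX_RANGE_M
    return min(NUM_PINS, int(fraction * NUM_PINS) + 1)

def interface_state(distances_by_column):
    """Map per-direction obstacle distances (5 columns) to raised-pin counts."""
    return [pins_for_distance(d) for d in distances_by_column]

# Example: clear path on the far left/right, an obstacle 1 m away dead ahead.
print(interface_state([5.0, 3.5, 1.0, 2.5, 5.0]))  # -> [0, 1, 4, 2, 0]
```

A distant obstacle raises only the top pin in its column, and more pins rise as it approaches, matching the behavior Jeon describes.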
The navigation system that’s about the size of a pair of binoculars is still a prototype, but once it moves beyond the research facility, it has the potential to change the way visually impaired individuals interact with the spaces around them. Jeon said they are working on something even smaller that’s the size of a flashlight; the wearer would simply point it in any direction and it would recognize what’s in front of a person.
The chip’s processing power matches that of common processors, but it requires only one-thousandth the power to run the same computations. Anything the 3D-video camera captures is converted into something called a point cloud, the university explains: a representation of the camera’s footage based on the surfaces of objects. The navigation system runs an algorithm over the point cloud to determine how close objects are to one another and to the person.
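To make the point-cloud step concrete, here is a hedged sketch of the general idea: take 3D points from a depth camera (x right, y up, z forward, in meters) and find the nearest obstacle in each of several horizontal sectors. The sector count and field of view are assumptions for illustration; this is not MIT’s actual algorithm.

```python
# Illustrative point-cloud processing: bin points by horizontal bearing
# and report the closest obstacle per sector. Five sectors over an assumed
# 60-degree field of view, chosen to mirror the five-column interface.
import math

def nearest_by_sector(points, num_sectors=5, fov_deg=60.0):
    """Return the closest obstacle distance per horizontal sector (inf if none)."""
    half_fov = math.radians(fov_deg) / 2.0
    nearest = [math.inf] * num_sectors
    for x, y, z in points:
        if z <= 0:
            continue  # ignore points behind the camera
        angle = math.atan2(x, z)  # horizontal bearing from straight ahead
        if abs(angle) > half_fov:
            continue  # outside the field of view
        sector = int((angle + half_fov) / (2 * half_fov) * num_sectors)
        sector = min(sector, num_sectors - 1)  # clamp the right edge
        dist = math.sqrt(x * x + y * y + z * z)
        nearest[sector] = min(nearest[sector], dist)
    return nearest

# Example: one point dead ahead at 2 m, one off to the right at about 3 m.
cloud = [(0.0, 0.0, 2.0), (1.5, 0.0, 2.6)]
print(nearest_by_sector(cloud))
```

The per-sector distances produced this way could then drive a feedback device like the pin interface described earlier.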
Additionally, researchers lowered power consumption by modifying the point-cloud algorithm to analyze objects in a set pattern rather than sporadically, and by having the system detect when the wearer isn’t moving so it can idle.
MIT’s system isn’t the first to use 3D aids to help visually impaired individuals, but it is the smallest.
Intel, for instance, is pairing computer vision with haptic feedback. The company created customized clothing using its RealSense 3D technology that turns the wearer into a walking environmental scanner. Similar to MIT’s navigation system, the Intel project scans and analyzes the environment, and vibrates to alert the wearer of impending objects.
The virtual “guide dog” system MIT developed is a practical, working solution; however, there’s still work to be done before it reaches consumers.
“If there is enough financial support, this may be available in the market within few years,” Jeon said.
Photo via lissalou66/Flickr (CC BY-ND 2.0)
Selena Larson is a technology reporter based in San Francisco who writes about the intersection of technology and culture. Her work explores new technologies and the way they impact industries, human behavior, and security and privacy. Since leaving the Daily Dot, she's reported for CNN Money and done technical writing for cybersecurity firm Dragos.