With a 3D camera and braille feedback, the system helps people interact with the world.
A tiny chip could be the driving force behind new technologies that let the visually impaired navigate the world.
Researchers at the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory developed a chip and navigation system, or virtual “guide dog,” that’s worn around the neck. It uses a Texas Instruments 3D-video camera and algorithms to analyze the space in front of the wearer and describe what the camera sees. The computer relays that information through a handheld braille interface, alerting the wearer when an obstacle is about to appear in their path.
Dongsuk Jeon, who was a postdoc at MIT’s Microsystems Research Laboratories when this technology was developed, explained that the handheld device can determine a safe distance to walk in different directions, and it uses Bluetooth to send the data from the computer to the device. It has five columns with four balls in each column that react to objects in front of the wearer.
“For instance, if there is nothing in front then all the pins are down,” Jeon said in an email to the Daily Dot. “If some obstacle appears in one direction from far away, only the top pin in that column will go up. As that obstacle gets closer, more pins will go up in that column towards the bottom.”
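The pin behavior Jeon describes can be sketched in a few lines of code. This is a hypothetical illustration, not the device’s actual firmware: the function name, the four-meter sensing range, and the distance bands are all assumptions made for the example.

```python
# Hypothetical sketch of the braille display logic Jeon describes:
# five columns (one per direction), four pins per column. The range
# and band thresholds below are assumptions, not the device's specs.

def pins_for_distance(distance_m, max_range_m=4.0, pins_per_column=4):
    """Return how many pins to raise for an obstacle at distance_m.

    Nothing in range -> 0 pins; an obstacle at the far edge of range
    raises only the top pin; more pins rise as it closes in.
    """
    if distance_m is None or distance_m >= max_range_m:
        return 0
    # Split the sensing range into equal bands; nearer bands raise more pins.
    band = max_range_m / pins_per_column
    raised = pins_per_column - int(distance_m // band)
    return min(max(raised, 1), pins_per_column)

# Distances (meters) sensed in each of the five directions, left to right.
distances = [None, 3.5, 1.2, 0.4, None]
print([pins_for_distance(d) for d in distances])  # [0, 1, 3, 4, 0]
```

As the obstacle at 0.4 m shows, the closest objects drive a full column of pins up, matching Jeon’s description of pins rising “towards the bottom” as an obstacle approaches.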
The navigation system, which is about the size of a pair of binoculars, is still a prototype, but once it moves beyond the research facility, it has the potential to change the way visually impaired individuals interact with the spaces around them. Jeon said the team is working on something even smaller, the size of a flashlight; the wearer would simply point it in any direction and it would recognize what’s ahead.
The chip’s processing power matches that of common processors, yet it requires only one-thousandth the power of a conventional processor running the same computations. Anything the 3D-video camera captures is converted into what’s called a point cloud, the university explains: a representation of the camera’s footage based on the surfaces of objects. The navigation system runs an algorithm on the point cloud to determine how close objects are to one another and to the person.
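To make the point-cloud step concrete, here is a minimal sketch, not MIT’s actual algorithm, of reducing a cloud of 3D points to a nearest-obstacle distance for each horizontal direction. The coordinate convention, field of view, and sector count are all assumptions chosen for the example.

```python
# Minimal sketch: reduce a point cloud to per-direction obstacle distances.
# Points are (x, y, z) in meters, camera at the origin looking down +z.
# The 60-degree field of view and five sectors are assumptions.
import math

def nearest_by_sector(points, n_sectors=5, fov_deg=60.0):
    """Return the nearest obstacle distance in each horizontal sector."""
    half = math.radians(fov_deg) / 2.0
    width = 2 * half / n_sectors
    nearest = [None] * n_sectors
    for x, y, z in points:
        if z <= 0:
            continue  # point is behind the camera
        angle = math.atan2(x, z)  # horizontal bearing of the point
        if not -half <= angle < half:
            continue  # outside the field of view
        sector = int((angle + half) / width)
        dist = math.sqrt(x * x + y * y + z * z)
        if nearest[sector] is None or dist < nearest[sector]:
            nearest[sector] = dist
    return nearest

cloud = [(0.0, 0.0, 2.0), (0.5, 0.1, 1.0), (-0.8, 0.0, 1.5)]
print(nearest_by_sector(cloud))
```

The per-sector distances produced here are exactly the kind of signal the braille display’s five columns would consume.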
Additionally, researchers worked to lower power consumption by modifying the point cloud algorithm to analyze objects in a set pattern instead of sporadically, and to know when a person isn’t moving to cut down on power.
MIT’s system isn’t the first to use 3D aids to help visually impaired individuals, but it is the smallest.
Intel, for instance, is pairing computer vision with haptic feedback. The company created customized clothing using its RealSense 3D technology that turns the wearer into a walking environmental scanner. Similar to MIT’s navigation system, the Intel project scans and analyzes the environment, and vibrates to alert the wearer of impending objects.
The virtual “guide dog” system developed by MIT is a practical, working prototype, but there’s still work to be done before it’s available to consumers.
“If there is enough financial support, this may be available in the market within few years,” Jeon said.
Photo via lissalou66/Flickr (CC BY-ND 2.0)
Selena Larson is a technology reporter based in San Francisco who writes about the intersection of technology and culture. Her work explores new technologies and the way they impact industries, human behavior, and security and privacy. Since leaving the Daily Dot, she's reported for CNN Money and done technical writing for cybersecurity firm Dragos.