Will eye-controlled prosthetics enable the next wave of human productivity?
Using nothing more than blinks and glances, it’s possible to manipulate a robotic arm well enough to paint a simple picture.
Dr. Aldo Faisal of Imperial College London’s Departments of Computing and Bioengineering successfully demonstrated a system that lets a human user control a paintbrush-wielding robot arm using only his or her eyes. He suggests this might lay the foundation for humanity’s future of robot-enabled multitasking.
Using technology that tracks the user’s eyes, the robot can receive and carry out a variety of commands, including cleaning the paintbrush and loading it with a different color. Once the robot is holding a paint-dabbed brush, it’s apparently pretty intuitive to use. Sabine Dziemian, a postgraduate in Faisal’s research group, says, “If I want to draw a straight line, I look at the start point and the end point, and the robot moves the brush across that line.”
Blinking three times puts the robot in color selection mode, in which it moves the brush over to a variety of pre-dispensed colors. At that point, the user only needs to look at the color he or she wants to use next, and the arm applies the color to the brush.
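The interaction model described above (gaze fixations define a stroke, three blinks switch into color selection) can be sketched as a small state machine. This is a hypothetical illustration only; the Imperial College team’s actual software, event names, and thresholds are not public, so everything below is invented for the sake of the example.

```python
# Hypothetical sketch of the blink/gaze command loop described above.
# Not the research team's code; names and thresholds are illustrative.

BLINKS_FOR_COLOR_MODE = 3  # three blinks enter color-selection mode


class GazePainter:
    def __init__(self):
        self.mode = "draw"          # "draw" or "select_color"
        self.blink_count = 0
        self.pending_start = None   # first fixation point of a stroke

    def on_blink(self):
        """Count consecutive blinks; three in a row switch modes."""
        self.blink_count += 1
        if self.blink_count >= BLINKS_FOR_COLOR_MODE:
            self.blink_count = 0
            self.mode = "select_color"
            return "move brush to palette"
        return None

    def on_fixation(self, target):
        """Handle a steady gaze on a point (canvas) or a color (palette)."""
        self.blink_count = 0  # a steady gaze resets the blink counter
        if self.mode == "select_color":
            self.mode = "draw"
            return f"load color {target}"
        # In draw mode, two successive fixations define a stroke:
        # first the start point, then the end point.
        if self.pending_start is None:
            self.pending_start = target
            return None
        start, self.pending_start = self.pending_start, None
        return f"stroke from {start} to {target}"
```

A straight line, as Dziemian describes it, then takes just two fixations: the first records the start point, the second triggers the stroke; three blinks at any time jump to the palette, and the next fixation picks the color.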
“Since time immemorial, human imagination has sparked the idea of having additional arms,” says Faisal. He invokes the multiarmed Hindu god Shiva, often a symbol of transformation, to suggest we might one day “do the dishes while taking a phone call or [holding] your baby and [preparing] the food at the same time, because you have just that extra pair of hands attached to you.”
Robotic prosthetics are developing rapidly. Last month, researchers at Sandia National Laboratories made significant progress in wiring a prosthetic device to the human brain, while leading manufacturer Ossur has developed a battery-powered knee that optimizes movements based on the wearer’s natural gait.
Taken together, these developments could not only make able-bodied humans more productive but also hold immediately obvious potential for people with disabilities, who might use remotely controlled robotic arms to maintain greater independence.
“If you’re controlling a third arm, you want it to be natural, seamless, and you don’t want to think about it,” Faisal says.
Dylan Love is an editorial consultant and journalist whose reporting interests include emergent technology, digital media, and Russian language and culture. He is a former staff writer for the Daily Dot, and his work has been published by Business Insider, International Business Times, Men's Journal, and the Next Web.