Using nothing more than blinks and glances, it’s possible to manipulate a robotic arm well enough to paint a simple picture.
Dr. Aldo Faisal of Imperial College London’s Departments of Computing and Bioengineering successfully demonstrated a system that lets a human user control a paintbrush-wielding robot arm using only his or her eyes. He suggests this might lay the foundation for humanity’s future of robot-enabled multitasking.
Using technology that tracks the user’s eyes, the robot is able to receive and understand a variety of commands, even to clean the paintbrush and prepare it with a different color. Once the robot is holding a paint-dabbed brush, it’s apparently pretty intuitive to use. Sabine Dziemian, a postgraduate in Faisal’s research group, says, “If I want to draw a straight line, I look at the start point and the end point, and the robot moves the brush across that line.”
Blinking three times puts the robot in color selection mode, in which it moves the brush over to a variety of pre-dispensed colors. At that point, the user only needs to look at the color he or she wants to use next, and the arm applies the color to the brush.
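The control scheme described above — a triple blink to enter color-selection mode, then fixations that either pick a color or mark the start and end of a stroke — can be sketched as a small state machine. This is purely illustrative; the `GazeEvent` and `GazeController` names are invented here, and the actual Imperial College system's software is not public.

```python
from dataclasses import dataclass

@dataclass
class GazeEvent:
    kind: str                      # "blink" or "fixation"
    target: tuple = None           # (x, y) gaze point for fixations

class GazeController:
    """Toy interpreter for the interface the article describes:
    three blinks switch to color-selection mode; fixations either
    choose a color or define a stroke's start and end points."""

    def __init__(self):
        self.mode = "draw"
        self.blinks = 0
        self.pending_start = None
        self.commands = []         # commands that would go to the robot arm

    def handle(self, event):
        if event.kind == "blink":
            self.blinks += 1
            if self.blinks == 3:   # triple blink -> pick a new color
                self.mode = "select_color"
                self.blinks = 0
            return
        self.blinks = 0            # a fixation resets the blink count
        if self.mode == "select_color":
            # Looking at a paint well loads that color onto the brush
            self.commands.append(("load_color", event.target))
            self.mode = "draw"
        elif self.pending_start is None:
            self.pending_start = event.target      # stroke start point
        else:
            # Second fixation: draw a line between the two gaze points
            self.commands.append(("stroke", self.pending_start, event.target))
            self.pending_start = None
```

Feeding it three blinks, a fixation on a paint well, and two fixations on the canvas would queue a `load_color` command followed by a single `stroke`, matching Dziemian's "start point and end point" description.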
“Since time immemorial, human imagination has sparked the idea of having additional arms,” says Faisal. He invokes the multiarmed Hindu god Shiva, often a symbol of transformation, to suggest we might one day “do the dishes while taking a phone call or [holding] your baby and [preparing] the food at the same time, because you have just that extra pair of hands attached to you.”
Robotic prosthetics are developing rapidly. Last month, researchers at Sandia National Laboratories made significant progress in wiring a prosthetic device to the human brain, while leading manufacturer Ossur has developed a battery-powered knee that optimizes movements based on the wearer’s natural gait.
Taken as a whole, these developments could not only make able-bodied humans more productive but also hold immediately obvious potential for people with disabilities, who might use remotely controlled robotic arms to maintain more independence.
“If you’re controlling a third arm, you want it to be natural, seamless, and you don’t want to think about it,” Faisal says.
Dylan Love is an editorial consultant and journalist whose reporting interests include emergent technology, digital media, and Russian language and culture. He is a former staff writer for the Daily Dot, and his work has been published by Business Insider, International Business Times, Men's Journal, and the Next Web.