
The field of soft robotics is all about a gentle touch



Dylan Love


Posted on Oct 2, 2015   Updated on May 27, 2021, 9:12 pm CDT

Robots are developing a more sensitive touch.

Rather than use conventional rigid metal and plastic, MIT graduate student Bianca Homberg led a team of researchers in creating a next-generation robotic hand made of silicone. Not only does the gentler material enable a robot to grip and pick up delicate items that would be difficult or impossible for other hands to handle, but it has a sense of “touch” that can identify what it is holding based on how its fingers wrap around an item.

This is an example of “soft robotics,” the notion that robots can be pliable and flexible while remaining useful, rather than “hard” and industrial-looking. A softer form factor enables robots to do things we might not think them conventionally capable of. In 2014, MIT developed a fish robot that illustrates the concept quite well; without any pistons or actuators, the fish was able to swim underwater and execute acrobatic maneuvers like a living thing.

Homberg’s hand is promising not for its acrobatics, but for its delicacy.

“With a rigid hand,” she told CNBC, “there has to be a lot of complicated grasp planning to figure out how exactly it is going to pick up the object—where it is going to put its fingers so that it doesn’t drop [the object]. With a soft hand you just grab it and the fingers bend around the object and pick it up.”

For the hand to pick up a dainty little egg, for example, it doesn’t need to calculate the movements for a number of rigid joints to precisely grip the egg. The silicone form factor means that it only needs to be close enough in its movements; the robot doesn’t need to be preoccupied with precision in order to successfully accomplish its task.

Just as some robots make some sense of the world by running computer vision algorithms on a video feed, this three-fingered hand employs “bend sensors” to form a model of the size and shape of the object it is holding. The sensors feed this data back to the robot’s processor, which compares it to its collection of data on other objects that it’s held in order to identify what it’s holding. On some level, a robot can “see” with this hand.

“With just three data points from a single grasp,” writes Adam Conner-Simons, “the robot’s algorithms can distinguish between objects as similar in size as a cup and a lemonade bottle.”
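The identification step described above can be sketched as a simple nearest-neighbor comparison: match a new grasp’s sensor readings against signatures recorded from known objects. This is only an illustrative sketch; the sensor values, object names, and the `identify` function are invented for the example, and the article does not describe CSAIL’s actual algorithm in this detail.

```python
# Hypothetical sketch of grasp-based object identification.
# All sensor values and object signatures below are made up for
# illustration; they do not come from the CSAIL hand.
import math

# Reference "grasp signatures": three bend-sensor readings recorded
# while holding each known object (invented values).
KNOWN_GRASPS = {
    "cup":             (0.62, 0.60, 0.61),
    "lemonade bottle": (0.48, 0.50, 0.47),
    "egg":             (0.81, 0.79, 0.80),
}

def identify(reading):
    """Return the known object whose signature is closest
    to this grasp's three sensor readings (Euclidean distance)."""
    def dist(sig):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(reading, sig)))
    return min(KNOWN_GRASPS, key=lambda name: dist(KNOWN_GRASPS[name]))

# A fresh grasp whose readings sit closest to the "cup" signature:
print(identify((0.60, 0.61, 0.63)))
```

With only three numbers per grasp, such a lookup can separate objects whose signatures differ enough, which matches the cup-versus-bottle distinction quoted above.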

The days of robot-assisted dishwashing loom ever closer.

Screengrab via MIT CSAIL/YouTube

*First Published: Oct 2, 2015, 1:55 pm CDT