The key to this new control system is ‘hands off.’
Being hands-off usually means relinquishing control. With Gest, it’s the exact opposite.
Remember Tony Stark in his lab in Iron Man, browsing through a massive amount of data with a literal swipe of his hand? That’s the type of control that Gest, a minimalist wearable hand tracker, wants to bring to your computer. (Presumably you don’t have a holographic display that fills the room of your private lab.)
Gest, developed by startup Apotact Labs, doesn’t look like much; its four wires with adjustable finger bands extend from a plain black strap that wraps around the palm. But thanks to its combination of accelerometers, gyroscopes, and magnetometers, it enables touchless control that would otherwise require a mouse or keyboard.
Apotact Labs CEO and co-founder Mike Pfister told the Daily Dot that the idea for Gest was born out of the frustration he felt trying to use tiny onscreen keyboards.
“The keyboard and mouse are starting to make less sense as our devices get smaller,” Pfister said. “We need a fundamentally new input method to control these devices—something that gets back to the basics of how we interact as humans. That’s why we created Gest.”
Pfister isn’t the first to explore the concept of gesture controls. Gest follows products like Manus, a glove designed primarily to improve interactions in virtual reality, and Leap Motion‘s completely accessory-free controls, which Elon Musk famously demonstrated at SpaceX. But Gest appears to be the most practical and versatile option yet.
By far the most intriguing application of Gest is air typing—the ability to type with no keyboard.
It’s still an experimental feature at this point, and it appears from the demonstration that it relies on predictive text, like most smartphone keyboards, to correct for any inaccuracies. Being able to call on this feature anywhere would be a boon to mobile device users. It would also offer an input method for virtual reality that wouldn’t ruin the immersion.
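The kind of predictive correction air typing depends on can be pictured with a small sketch: snap a noisy keystroke sequence to the nearest word in a vocabulary by edit distance. This is an illustrative example, not how Gest actually implements it; the vocabulary and inputs are invented.

```python
# A minimal sketch of predictive correction for noisy typing input.
# Vocabulary and typed strings here are hypothetical.

def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def correct(typed, vocabulary):
    """Return the vocabulary word closest to the noisy input."""
    return min(vocabulary, key=lambda w: edit_distance(typed, w))
```

A smartphone-style keyboard would weight candidates by language-model probability as well, but even plain edit distance recovers `"hello"` from a noisy `"helko"`.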
Unlike a true motion tracker such as Leap Motion or even the Xbox Kinect, Gest isn’t tracking exact motions. It’s producing actions based on input the same way double-clicking the mouse or pressing the “D” key produces a given action. It’s just getting those commands from hand movements.
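That distinction—discrete commands rather than continuous tracking—can be pictured as a simple dispatch table. This is an illustrative sketch under invented names, not the actual Gest SDK; the gestures and actions are hypothetical.

```python
# Hypothetical gesture-to-command dispatch, mirroring how a recognized
# gesture could trigger a discrete action the way a key press does.

def zoom_in():
    return "zoom +10%"

def zoom_out():
    return "zoom -10%"

def next_page():
    return "page forward"

# Each recognized gesture is bound to one action; no continuous
# motion data ever reaches the application.
GESTURE_ACTIONS = {
    "pinch": zoom_out,
    "spread": zoom_in,
    "swipe_left": next_page,
}

def handle_gesture(name):
    """Dispatch a recognized gesture to its bound action, if any."""
    action = GESTURE_ACTIONS.get(name)
    return action() if action else None
```

Because only these compact, named events cross the Bluetooth link, the device can stay small and frugal with battery—but any application that lacks such bindings simply does nothing with the input.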
While this distinction provides considerable benefits for the device—it’s portable, the battery lasts longer, and it can connect to basically any device that supports Bluetooth Low Energy—it also represents an adoption hurdle. Gest won’t work with software that doesn’t contain special support for it.
When Gest launches—the current target date is November 2016—it will offer support for Photoshop. Pfister said that his team is “focused on building a really great experience with Photoshop using Adobe’s SDK [software development kit],” but he isn’t working directly with Adobe right now.
Apotact Labs has already opened up the Gest development kit with the hope of encouraging developers to build new applications, integrations, and use cases for the gesture control system. The company will also offer developers access to raw sensor data and custom skeletal models, as well as motion-processed data, to better inform their work.
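The gap between “raw sensor data” and “motion-processed data” can be illustrated with a textbook technique: a complementary filter that fuses a gyroscope’s rate with an accelerometer’s gravity reading into a stable orientation angle. This is a generic sketch, not the Gest SDK’s actual processing; the function and its parameters are assumptions for illustration.

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One step of a complementary filter, estimating pitch in degrees.

    Blends the gyroscope's integrated rate (accurate short-term, but
    drifts) with the accelerometer's gravity-derived angle (noisy, but
    stable long-term). Hypothetical example, not the real Gest pipeline.
    """
    # Angle implied by gravity's direction in the accelerometer frame.
    accel_angle = math.degrees(math.atan2(accel_x, accel_z))
    # Mostly trust the gyro step; gently pull toward the accel estimate.
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

A developer working from raw streams would run something like this per axis at the sensor’s sample rate; getting motion-processed data from the kit instead skips that work entirely.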
“We don’t need to work directly with developers,” Pfister explained. “It’s important to us that we give them the room to create and contribute on their own. We value that greatly.”
Gest is already well on its way to meeting its fundraising goal. Early-bird backers can score Gest for a $99 pledge. That’s just for one hand, though. When Gest eventually hits the market, it will retail for $200 per hand.