I downloaded Myo Connect Beta, launched it, and then slid the Myo band onto the thickest part of my forearm with the logo facing up. The video mentioned that the device needs to be snug against bare skin so that the electromyography (EMG) sensors surrounding the band can sense the electrical signals produced by muscle movement. Then it just needed to “warm up” for a few minutes before syncing could begin.
Syncing is easy: Hold your arm like you’re a waiter at a restaurant, and sweep it out, as if to say, “Look at this delicious Baked Alaska.” You’ll get a little buzz from the Myo if you do it correctly. The setup then walks you through a series of interactions that use the Myo’s main gestures.
According to the ThalmicLabs website, the band also contains a highly sensitive nine-axis IMU: a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer. I’m assuming that this is what reports motion, speed, pitch, and yaw, but only rotation detection was immediately evident in the demo.
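To give a sense of what an IMU like this can report, here is a minimal sketch of the standard tilt formulas for recovering pitch and roll from a three-axis accelerometer at rest. This is textbook math, not Thalmic’s actual sensor-fusion code, and the axis conventions are an assumption:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from the gravity vector
    measured along each accelerometer axis. Standard tilt formulas;
    axis orientation is an assumed convention, not Myo's."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll
```

With the band flat and at rest, gravity reads entirely on the z-axis and both angles come out zero; rotate the forearm and the gravity vector shifts across axes, which is presumably part of how the rotation detection in the demo works.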
Controls for a few popular programs like Spotify, Keynote, and Acrobat Reader are built into Myo Connect, so I fired up Spotify and gave it a go. Here’s what I found:
Gesture recognition can be coarse
It feels like you’re playing Rock-Paper-Scissors with your computer and the computer keeps winning. Spotify is tricky to control; the Myo recognized my gestures about 50 percent of the time. Fist gestures were recognized around 10 percent of the time.
I think that part of the reason was user error. It is unclear how long the “unlocked” state (where the Myo recognizes gestures) lasts. Sometimes the Myo recognized a gesture while locked. Other times, I needed to perform the unlocking gesture and immediately perform the second gesture for recognition to occur. Starting out, this looked hilarious, but it quickly became embarrassing when I cranked Dolly Parton’s “Jolene” all the way up and couldn’t get it to turn back down.
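The behavior I observed could be modeled as a simple timeout state machine. This is a hypothetical sketch of how such a lock might work, not the Myo’s actual firmware, and the length of the unlock window is a guess:

```python
import time

UNLOCK_WINDOW = 2.0  # seconds; an assumed value, the real timeout is undocumented here

class GestureLock:
    """Hypothetical model of an unlock-then-gesture timeout."""

    def __init__(self, window=UNLOCK_WINDOW):
        self.window = window
        self.unlocked_at = None

    def unlock(self):
        """Called when the unlock gesture is recognized."""
        self.unlocked_at = time.monotonic()

    def accept(self, gesture):
        """Return True if a recognized gesture should be acted on right now."""
        if self.unlocked_at is None:
            return False  # band is locked; gesture ignored
        if time.monotonic() - self.unlocked_at > self.window:
            self.unlocked_at = None  # window expired; band re-locks
            return False
        return True
```

If the real window is short, that would explain why chaining the unlock gesture and the command gesture back-to-back was the only reliable sequence.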
These poor results could have just been me, so I handed the Myo around to a few of my Instrument teammates to try out, and the results were the same. We tried recalibrating via Myo Connect. Again, we experienced the same results.
We switched applications to HandyBrowser, and the results were a little better. The app had a few suggestions that helped, including a warning about combo-gesture timeout, and helpful directions about disabling auto-lock. I’m not sure about the fist-gesture based scroll, but it worked better than fist-gesture based volume on Spotify.
Myo should make an experience better
Keynote was our last test, and it turned out to be the most popular use of the Myo among the designers. It was simple and it worked every time. The app relies on the three gestures that are most consistently recognized by the Myo: unlock, wave left, and wave right. That’s really all that Keynote needs, and in this case, doing less actually makes for a better experience.
Fatigue is a real issue
I got the best response from the Myo when a gesture used a lot of muscle contraction. I found myself making movements that fell outside of my standard range of motion, which caused fatigue fairly quickly. I assume that this will get better as the Myo evolves, but in the meantime, something for experience designers to consider:
Not all gestures are equal
Waving outward fights against the way our wrists work. Waving toward the body is a lot easier and can be sustained longer, even if that motion needs to be extreme. The double tap was the second easiest gesture to make, and was almost always recognized by the sensors. Making a fist and then spreading your fingers requires the most effort and may be the most difficult for users with motor control issues or arthritis.
Good experience design can smooth the rough edges
The Myo has the potential to be a nice interface device, and even in these early stages there are some applications that are good enough to use. It would be easy to make them better. Take Spotify for example. These are the current interactions:
Waving toward your body to go back to the previous track and closing your hand to control volume both feel counterintuitive. Opening your fingers to let the music flow feels right, but in practice, triggering it consistently meant going from a closed fist to spread fingers, and that fist could trigger the volume control instead.
Here is how I would modify the interaction:
First, do away with the timed unlock, or at least extend it to five seconds so you can get a gesture off and recognized before time runs out. Double tap is the most reliable gesture, so it should be paired with the most important functions: start and stop. The fist gesture is gone, replaced by the open hand combined with rotation to control volume. A slow wave in scrolls through tracks, a double wave in advances one track at a time, and a double wave out loads the previous track.
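The remapping above can be sketched as a small gesture-to-action table. The gesture names and action labels here are illustrative, not Myo SDK identifiers:

```python
# Hypothetical sketch of the remapped Spotify controls proposed above.
# Gesture and action names are illustrative labels, not the Myo SDK's.
GESTURE_MAP = {
    "double_tap": "play_pause",      # most reliable gesture -> most important function
    "wave_in": "scrub_tracks",       # slow wave toward the body pans through tracks
    "double_wave_in": "next_track",
    "double_wave_out": "previous_track",
}

def handle(gesture, rotation=None):
    """Resolve a recognized gesture to an (action, value) pair."""
    # Open hand combined with forearm rotation replaces the fist for volume.
    if gesture == "open_hand" and rotation is not None:
        return ("set_volume", rotation)
    return (GESTURE_MAP.get(gesture), None)
```

The point of the table is that the most reliable gestures carry the highest-stakes actions, and the unreliable fist disappears entirely.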
I think that ThalmicLabs is onto a good thing here and I’m looking forward to seeing how the Myo evolves over time. I assume that the coarseness of some of the gesture recognition will be resolved as more people use the device and ThalmicLabs acquires a bigger dataset to draw upon. With this data should come greater control and less reliance on the kind of broad gestures that characterize this current moment in interaction design. Hopefully we will eventually have controls that can seamlessly be incorporated into everyday life and take me ever further away from touching a screen.
Photo via ThalmicLabs