I have been looking forward to this device ever since it popped up as an alternative to hand tracking. It works by analysing the electrical impulses travelling through your arm to understand the gestures your hand and wrist are making, and combining this with an IMU to track where your arm is. It connects via Bluetooth, and there are Windows, Mac, Unity3d, iOS and Android SDKs to work with. It makes quite a few promises about what can be achieved, and I cannot wait to get started with it.
The first thing I noticed when unboxing the Myo was that it came with a Bluetooth dongle, which was great: so many accessories don't include one, and my Mac, which is showing its years, cannot connect to some. I had no issues here and got stuck into the setup procedure. The getting-started guide was great; it stepped you through the different gestures, which are:
- Wave out
- Wave in
- Spread fingers
- Thumb and pinky touch
- Activation gesture
I did find that the sensor worked better on my left arm, and that I had to touch the knuckle of my pinky rather than the end of the finger. I assume that, as I have large hands, it needs a slightly larger action to detect; but then this is a developer device.
Out of the box you can control PowerPoint and iTunes on Windows/Mac. These both work well, and with a bit of trial and error I discovered the gestures that worked reliably. I found that waving back after a wave sometimes triggered the opposite gesture, so it would take me forward and then back one slide in one go. I also tried out the Myo with my iPhone, where it only controlled my music library, of which I have none as I use Spotify, but again I was pleased with how responsive and easy to use it was.
So what else can you get it to do? I don't think the true potential of this device has been realised yet. The way to get more functionality is to leverage the Mac, Windows, iOS or Android SDKs. There is also a Unity3d package that you can include to integrate quite quickly. I have tried this, and the current demo is a stick with a lump on the end: the stick moves in sync with the IMU, and the lump changes colour based on actions, as you can see in the image below. This is a long way off the demo videos showing accurate finger tracking, but it is a start.
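Keeping an object in sync with the IMU mostly comes down to applying the armband's reported orientation quaternion to the object's transform (Unity's `Quaternion` type does this for you). As a rough, Unity-free sketch of the underlying maths, here is the standard quaternion rotation v' = q v q⁻¹ in plain Python; the example rotation value is made up for illustration, not something the Myo SDK returns:

```python
import math

def quat_mul(a, b):
    # Hamilton product of two quaternions, stored as (w, x, y, z)
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    )

def rotate(q, v):
    # Rotate vector v by unit quaternion q: v' = q * v * q_conjugate
    qc = (q[0], -q[1], -q[2], -q[3])
    w, x, y, z = quat_mul(quat_mul(q, (0.0,) + v), qc)
    return (x, y, z)

# Illustrative example: a 90-degree rotation about the x axis, as the IMU
# might report while you tilt your forearm.
theta = math.pi / 2
q = (math.cos(theta / 2), math.sin(theta / 2), 0.0, 0.0)
print(rotate(q, (0.0, 0.0, 1.0)))  # the stick's "up" vector, now (0, -1, 0)
```

In Unity you would never hand-roll this; you would assign the orientation to `transform.rotation` each frame and let the engine do the maths.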
As an alternative way of creating interaction mechanisms for the Myo, there are Lua scripts. You can code scripts that take into account which app is running and then fire simple inputs, such as key presses, from those interactions. I have tried out Spotify integration and am looking at creating a few others.
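These scripts essentially boil down to a lookup: given the foreground application and the pose the Myo just reported, synthesise a key press. Here is a stand-alone Python sketch of that pattern, not the actual Myo Lua API; the app names, pose names and the `send_keys` stub are all placeholders of my own:

```python
# Hypothetical app-aware gesture dispatcher, mirroring the shape of a
# Myo connector script: (foreground app, pose) -> key press.
# All app names, pose names and the send_keys stub are illustrative.

BINDINGS = {
    ("Spotify", "waveOut"): "next_track",
    ("Spotify", "waveIn"): "previous_track",
    ("PowerPoint", "waveOut"): "right_arrow",
    ("PowerPoint", "waveIn"): "left_arrow",
}

def send_keys(key):
    # Placeholder: a real script would ask the OS (or the Myo software)
    # to synthesise this key press.
    print(f"pressing {key}")

def on_pose(foreground_app, pose):
    """Called whenever the armband reports a pose; returns the key sent."""
    key = BINDINGS.get((foreground_app, pose))
    if key is not None:
        send_keys(key)
    return key

on_pose("Spotify", "waveOut")   # sends next_track
on_pose("TextEdit", "waveOut")  # no binding, does nothing
```

The nice property of a table like `BINDINGS` is that adding support for another app is one line, which matches how quick it felt to knock together the Spotify integration.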
Overall I am impressed with the Myo; I think it is a great idea for a device. There are many benefits: it is easily concealed, it will work anywhere I go without needing setup or having to face cameras, and it is high tech. There are some bugbears too: it leaves an impression on the arm after a while, so I wouldn't condone prolonged use; there is no way to train new gestures; and the SDKs could do with more of the features they say are coming. But this is a developer device, all of these features will come, and I cannot wait.