Back in 2008, I did some work on 2D and 3D gesture recognition. I have a few papers (listed in the Publications section of the About page) and a few YouTube videos (here, here, and here) that discuss some of that work. Recently, I’ve resurrected that 3D gesture classification project and have made the code available on Bitbucket.
This system uses one or two Nintendo Wiimotes and presents a WPF user interface for training and classification. Here’s a screenshot.
This update was prompted by a lecture series on game-related AI, and it gave me an opportunity to hone some WPF skills. In particular, the code adheres to the MVVM pattern and employs the Prism and Unity libraries (the Unity Container for dependency injection — not to be confused with Unity 3D the game engine).
I’ve included some sample training data for left-hand, right-hand, and combination gestures (both hands) that mimic a few military hand signals. For example, Attention is waving your hand back and forth above your head, and Halt is raising your hand straight up above your head. If you have trouble getting these gestures to classify, it’s best for you to train some of your own gestures (so you know exactly how they should be performed).
The system isn’t limited to Nintendo Wiimotes and should work with any device that produces 3-axis accelerometer data — I’ve done exactly that with an accelerometer-equipped glove. I’ve also used the same underlying system for 2D classification and have experimented with different machine learning algorithms (e.g. AdaBoost). The classifier included in this software is a statistical classifier, and the project is based on Dean Rubine’s work on gesture recognition.
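To give a flavor of what a Rubine-style statistical classifier looks like, here’s a minimal Python sketch (the actual project is C#/WPF, and these particular features and names are my own illustrative choices, not the project’s). Each gesture — a sequence of 3-axis accelerometer samples — is reduced to a fixed-length feature vector, then classified with per-class linear discriminants built from class mean features and a pooled, regularized covariance matrix:

```python
import numpy as np

def features(samples):
    """Reduce a gesture (N x 3 accelerometer samples) to a fixed-length
    feature vector: per-axis mean, per-axis standard deviation, and
    average signal energy. (Illustrative features, not Rubine's exact set.)"""
    samples = np.asarray(samples, dtype=float)
    return np.concatenate([
        samples.mean(axis=0),
        samples.std(axis=0),
        [np.sum(samples ** 2) / len(samples)],
    ])

class RubineStyleClassifier:
    """Linear statistical classifier in the spirit of Rubine (1991):
    per-class mean feature vectors, a covariance matrix pooled across
    classes, and linear discriminant weights derived from both."""

    def fit(self, gestures, labels):
        X = np.array([features(g) for g in gestures])
        y = np.array(labels)
        self.classes = sorted(set(labels))
        self.means = {c: X[y == c].mean(axis=0) for c in self.classes}
        # Pooled covariance, lightly regularized so it is always invertible.
        centered = np.vstack([X[y == c] - self.means[c] for c in self.classes])
        cov = centered.T @ centered / len(X) + 1e-3 * np.eye(X.shape[1])
        inv = np.linalg.inv(cov)
        # Linear discriminant: score_c(f) = w_c . f + b_c
        self.weights = {c: inv @ self.means[c] for c in self.classes}
        self.bias = {c: -0.5 * self.means[c] @ inv @ self.means[c]
                     for c in self.classes}
        return self

    def classify(self, gesture):
        f = features(gesture)
        return max(self.classes,
                   key=lambda c: f @ self.weights[c] + self.bias[c])
```

Because the features only depend on getting a stream of (x, y, z) samples, the same code works whether those samples come from a Wiimote, a glove, or any other accelerometer source.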
If you have any questions, just shout.