3-Dimensional Gesture Recognition: Algorithms and Applications

Student Name: Johan Bas
Thesis Type: Master Thesis
Thesis Status:
Academic Year: 2010 - 2011
Master in de Toegepaste Informatica (Master in Applied Informatics)
Beat Signer

When Nintendo released its Wii gaming platform, it introduced motion sensing to the general public. Motion sensing is a technique that lets users control electronic devices by performing hand or full-body movements. It is a more natural way to communicate with computers than keyboards, mice, buttons or other input devices. Motion sensors measure specific physical quantities, including acceleration, rotation and magnetic fields. Many consumer electronics devices are equipped with such sensors to provide a better user experience. For example, photo cameras may contain a tilt sensor that detects whether a picture was taken in portrait or landscape mode. The motion sensors embedded in electronic devices can not only provide data to the device itself, but also serve as input for other applications that users want to interact with. This enables developers to create applications that can be controlled through user motion without having to develop and sell dedicated motion sensing hardware.

The iGesture framework has been developed to help application developers register and recognise gestures. Originally, iGesture supported 2D gestures recorded with a digital pen or a mouse. Later, 3D gesture recognition was added to the framework. iGesture distinguishes itself from other frameworks through its support for multiple input devices: developers can add new devices to the framework in such a way that existing gesture recognition algorithms can be reused.

As part of this thesis, we first carried out a thorough investigation of common sensors and how they can be used in gesture recognition. In addition, we analysed existing motion sensing devices and inspected their capabilities. Besides these two studies, the iGesture framework has been analysed to become familiar with its current support for 3D gesture recognition.

In a second phase, support for new IP-based devices has been integrated into iGesture. IP-based devices are input devices with embedded motion sensors that are capable of communicating via the IP protocol. An Android application has been developed to support iGesture's IP-based communication on motion sensing devices running the Android operating system. The addition of new devices introduces more flexibility and provides a larger collection of devices for future iGesture developers to choose from. Besides the integration of these new devices, a 3D gesture recognition algorithm based on Dynamic Time Warping (DTW) has been developed and integrated.
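To illustrate the general idea behind DTW-based gesture matching, the following is a minimal sketch, not the actual iGesture implementation: a recorded gesture is treated as a sequence of 3D acceleration samples, the DTW distance aligns two sequences that may differ in speed and length, and a sample is classified by the nearest gesture template. All function names and the simple nearest-template scheme here are illustrative assumptions.

```python
import math

def dtw_distance(a, b):
    """DTW distance between two 3D sample sequences, each a list of
    (x, y, z) tuples. Illustrative sketch, not the iGesture code."""
    n, m = len(a), len(b)
    INF = float("inf")
    # (n+1) x (m+1) cost matrix; D[0][0] is the empty-alignment base case
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # local cost: Euclidean distance between the two 3D samples
            d = math.dist(a[i - 1], b[j - 1])
            # extend the cheapest of the three admissible warping steps
            D[i][j] = d + min(D[i - 1][j],      # insertion
                              D[i][j - 1],      # deletion
                              D[i - 1][j - 1])  # match
    return D[n][m]

def classify(sample, templates):
    """Return the label of the template closest to the sample under DTW.
    `templates` maps a gesture label to a recorded template sequence."""
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))
```

Because DTW compares whole sequences rather than fixed-length feature vectors, the same gesture performed faster or slower still maps to a small distance, which is why it is a natural fit for accelerometer-based gesture recognition.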

Finally, an extensive set of tests has been performed to evaluate the recognition rates of our new algorithm. Different users performed various gestures with different input devices. The results show that the current algorithm implementation, in combination with the existing input devices and the new Android input devices, can be used for 3D gesture recognition.