Integration of Midas and iGesture

Type of Thesis: 
Master Thesis

While a variety of gesture recognition frameworks exists, none of them supports both application developers and the designers of new recognition algorithms. iGesture addresses both audiences: application developers who would like to add gesture recognition functionality to their applications as well as designers of new gesture recognition algorithms. The iGesture framework can easily be configured to use any of the existing recognition algorithms (e.g. Rubine, SiGeR), and customised gesture sets can be defined. Furthermore, our test bench provides tools to verify new gesture recognition algorithms and to evaluate their performance.
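To give a flavour of the kind of algorithm iGesture can host, the following is a minimal, self-contained sketch of the idea behind a SiGeR-style recogniser: a stroke is quantised into direction tokens and a gesture is described by a regular expression over those tokens. All class and method names here are hypothetical illustrations, not the actual iGesture API.

```java
import java.util.List;
import java.util.regex.Pattern;

// Hypothetical sketch of a SiGeR-style recogniser: quantise a stroke
// into four direction tokens (N, E, S, W) and match a gesture described
// as a regular expression over those tokens.
public class SigerSketch {

    // Quantise each consecutive point pair into one of four directions.
    static String directions(List<int[]> points) {
        StringBuilder sb = new StringBuilder();
        for (int i = 1; i < points.size(); i++) {
            int dx = points.get(i)[0] - points.get(i - 1)[0];
            int dy = points.get(i)[1] - points.get(i - 1)[1];
            if (Math.abs(dx) >= Math.abs(dy)) {
                sb.append(dx >= 0 ? 'E' : 'W');
            } else {
                sb.append(dy >= 0 ? 'S' : 'N');  // screen coordinates: y grows downwards
            }
        }
        return sb.toString();
    }

    // A stroke "matches" a gesture if its direction string satisfies the pattern.
    static boolean matches(String pattern, List<int[]> stroke) {
        return Pattern.matches(pattern, directions(stroke));
    }

    public static void main(String[] args) {
        // An L-shaped stroke: down, then right.
        List<int[]> stroke = List.of(
            new int[]{0, 0}, new int[]{0, 10}, new int[]{0, 20},
            new int[]{10, 20}, new int[]{20, 20});
        System.out.println(matches("S+E+", stroke));  // prints "true"
    }
}
```

Swapping such an algorithm for another (e.g. a Rubine feature-based classifier) behind a common recogniser interface is exactly the kind of configurability the framework description above refers to.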

Midas, on the other hand, is a declarative model for the definition and detection of multi-touch gestures, where gestures are expressed as logical rules over a set of input facts using spatio-temporal operators. The model provides the necessary software engineering abstractions, such as modularisation, composition, event categorisation and GUI-event correlation.
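The rule-based idea can be illustrated with a small, self-contained sketch: input events are stored as facts, and a gesture rule combines them under spatial and temporal constraints. Note that this is a hypothetical Java illustration of the concept, not Midas' actual rule language; the fact and rule names are invented for the example.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration of rule-based gesture detection: a fact base
// of touch events, and a rule that fires when two facts satisfy a
// spatial (distance) and a temporal (delay) constraint.
public class RuleSketch {

    // A single touch event fact: finger position and timestamp.
    record TouchFact(int x, int y, long time) {}

    static final List<TouchFact> factBase = new ArrayList<>();

    static void assertFact(TouchFact f) {
        factBase.add(f);
    }

    // Rule: two distinct touch facts count as a "two-finger tap" when they
    // occur within maxDist pixels and maxDelay milliseconds of each other.
    static boolean twoFingerTap(int maxDist, long maxDelay) {
        for (int i = 0; i < factBase.size(); i++) {
            for (int j = i + 1; j < factBase.size(); j++) {
                TouchFact a = factBase.get(i), b = factBase.get(j);
                double dist = Math.hypot(a.x() - b.x(), a.y() - b.y());
                long delay = Math.abs(a.time() - b.time());
                if (dist <= maxDist && delay <= maxDelay) {
                    return true;  // rule fired
                }
            }
        }
        return false;
    }

    public static void main(String[] args) {
        assertFact(new TouchFact(100, 100, 0));
        assertFact(new TouchFact(140, 100, 50));
        System.out.println(twoFingerTap(100, 200));  // prints "true"
    }
}
```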

In this thesis we plan to integrate the two different approaches in order to:

  1. Reuse the recognition algorithms implemented in iGesture (e.g. Rubine, SiGeR) within the Midas framework
  2. Provide a declarative interface as an additional algorithm for iGesture
  3. Use the iGesture workbench to debug and evaluate gestures implemented in Midas


Background Knowledge: 
  • Java
  • C (optional)
Technical challenges: 

  • Asynchronous coupling of two gesture recognition frameworks
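One common pattern for such a coupling, shown here purely as a hypothetical sketch, is to decouple the two frameworks through a thread-safe queue: a producer (e.g. a Midas-style rule engine emitting detected events) hands events to a consumer (e.g. an iGesture-style recogniser) without either side blocking the other's event loop. The `AsyncBridge` class and event names are invented for the example.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical sketch of asynchronously coupling two frameworks via a
// blocking queue: the producer publishes without waiting, the consumer
// blocks only until the next event is available.
public class AsyncBridge {

    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();

    // Called from the producer's thread; never blocks the producer.
    public void publish(String event) {
        queue.offer(event);
    }

    // Called from the consumer's thread; waits until an event is available.
    public String take() throws InterruptedException {
        return queue.take();
    }

    public static void main(String[] args) throws InterruptedException {
        AsyncBridge bridge = new AsyncBridge();
        Thread producer = new Thread(() -> bridge.publish("circle-gesture"));
        producer.start();
        System.out.println(bridge.take());  // prints "circle-gesture"
        producer.join();
    }
}
```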

Contact: 
  • Beat Signer
  • Bruno Dumas
  • Lode Hoste
Academic Year: 
2011-2012