Will you Listen to Me? Conflict Resolution in Multimodal Interfaces
Type of Thesis: Master Thesis
Multimodal interfaces are user-machine interfaces that make use of multiple input modalities, including speech, gestures or emotions. Due to the concurrent management of these different modalities, multimodal interfaces offer a number of specific advantages compared to standard GUI interfaces. They further offer the possibility to combine different modalities, increasing the expressive power and providing a more natural interaction with applications.
The extraction of meaningful information from different modalities occurring in parallel is a complex problem. It becomes even more complex if one takes the inherently unpredictable user behaviour and the imperfect performance of probabilistic recognisers into account. For example, if a user says to the computer "move this triangle there" while pointing to a circle on the screen, the computer finds itself in a delicate situation: should it still carry out the command or not? In this thesis, we will investigate conflict resolution in multimodal interaction and develop effective algorithms for it. The results of this thesis will be integrated with an existing framework for the creation of multimodal interactions that is currently being developed by WISE lab members [1,2].
[1] Lode Hoste, Bruno Dumas and Beat Signer, Mudra: A Unified Multimodal Interaction Framework, Proceedings of ICMI 2011, 13th International Conference on Multimodal Interaction, Alicante, Spain, November 2011.
[2] Bruno Dumas, Frameworks, Description Languages and Fusion Engines for Multimodal Interactive Systems, PhD Thesis no. 1695, Faculty of Sciences, University of Fribourg, Switzerland, December 2010.
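As a concrete illustration of the triangle/circle scenario above, one possible conflict resolution strategy is to compare the confidence scores of the individual recognisers and reject the command when neither modality is trustworthy enough. The following minimal Python sketch shows this idea; the class names, the `resolve_conflict` function and the rejection threshold are hypothetical and are not part of the Mudra framework:

```python
from dataclasses import dataclass

@dataclass
class SpeechInput:
    target_type: str   # object type mentioned in the utterance, e.g. "triangle"
    confidence: float  # recogniser confidence in [0, 1]

@dataclass
class PointingInput:
    object_id: str
    object_type: str   # type of the object actually pointed at
    confidence: float  # recogniser confidence in [0, 1]

def resolve_conflict(speech: SpeechInput, pointing: PointingInput,
                     threshold: float = 0.6):
    """Return the id of the object to act on, or None if the conflict
    cannot be resolved and the user should be asked for clarification."""
    if speech.target_type == pointing.object_type:
        return pointing.object_id  # modalities agree
    # Modalities disagree: trust the more confident recogniser, but only
    # if its confidence exceeds the rejection threshold.
    if pointing.confidence >= speech.confidence and pointing.confidence >= threshold:
        return pointing.object_id  # assume the deictic gesture is right
    return None  # reject or ask the user: no interpretation is reliable enough

# The "move this triangle there" example: speech mentions a triangle,
# but the pointing gesture lands on a circle.
speech = SpeechInput("triangle", confidence=0.7)
pointing = PointingInput("obj42", "circle", confidence=0.9)
print(resolve_conflict(speech, pointing))  # -> obj42 (gesture trusted)
```

This is of course only one of many conceivable strategies; the thesis is expected to identify further conflicting cases and compare more sophisticated resolution algorithms.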
- Compile a list of conflicting cases for multimodal input based on a literature study
- Study existing conflict resolution algorithms based on the multimodal interaction frameworks developed within the WISE lab
- Study how conflict resolution can be linked with the adaptation of multimodal interfaces