Title: Mobile multimodal user interfaces
Author: Kernchen, Jochen Ralf
ISNI:       0000 0004 2697 0128
Awarding Body: University of Surrey
Current Institution: University of Surrey
Date of Award: 2010
A multitude of information, and both single and structured multimedia content, is available from the World Wide Web in ever-increasing variety, in different modalities (e.g. audio, text, and video) and formats. Yet users of mobile and Web applications typically do not experience this content in its original quality or layout, owing to the form-factor limitations of their mobile devices. Awkwardly, the surrounding devices that could accommodate a better experience are mostly left out of mobile application interaction scenarios. To achieve a true Mobile Multimodal User Interfaces experience, dynamic user interface adaptation for mobile applications has to be tailored to pervasive multi-device environments, and automated adaptation should put users at the centre. Relevant context information, such as the available devices or the user's location, has to be considered to achieve a new user interaction experience. One of the key problems is the lack of suitable device descriptions for multimodal adaptation and their consistent integration across various discovery mechanisms (e.g. Bluetooth, UPnP, SIP). Furthermore, the optimal delivery of multimedia content across different devices with their varying capabilities, especially for structured or complex multimedia, has not yet been studied thoroughly. Finally, user interaction should be enabled across all available devices, so that a user can freely interact using their preferred input modality. Accordingly, the research in this thesis focuses on modality device descriptions and discovery, intelligent multimedia presentation delivery, and application control with exchangeable modalities.
Contributions include the definition of a system architecture for Mobile Multimodal User Interfaces, named the User Interface Adaptation Function (UIAF); a device description approach for user interface devices suitable for quality comparison with multimedia content; mechanisms for including different discovery means in one consistent discovery framework; a framework for application control with exchangeable modalities; and an extensive definition of the adaptation process for multimedia presentation delivery. Background is provided by a thorough review of related work in multimodal user interfaces, multimedia, and ubiquitous computing, with challenges identified. Based on this analysis, the UIAF system design is presented, and the proposed new functionalities are defined within the system architecture. Finally, a scenario-based evaluation presents two realisations of the UIAF, for a mobile terminal and for service platforms.
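The device-description-based quality comparison summarised above can be sketched as follows. This is a hypothetical, heavily simplified illustration: the class `DeviceDescription`, its fields, and the function `best_device` are assumptions for this sketch, not the thesis's actual UIAF data model, which covers far richer capability descriptions gathered from several discovery mechanisms.

```python
from dataclasses import dataclass

# Hypothetical, simplified device description for illustration only;
# the real UIAF descriptions carry many more capability fields.
@dataclass
class DeviceDescription:
    name: str
    modalities: set                  # e.g. {"audio", "text", "video"}
    resolution: tuple = (0, 0)       # (width, height); (0, 0) for audio-only

def best_device(devices, content_modality, min_resolution=(0, 0)):
    """Pick a device that supports the content's modality and meets its
    resolution requirement, preferring the highest resolution (a toy
    stand-in for a quality comparison against multimedia content)."""
    candidates = [d for d in devices
                  if content_modality in d.modalities
                  and d.resolution >= min_resolution]
    if not candidates:
        return None
    return max(candidates, key=lambda d: d.resolution)

devices = [
    DeviceDescription("phone", {"text", "audio", "video"}, (480, 320)),
    DeviceDescription("tv", {"video", "audio"}, (1920, 1080)),
    DeviceDescription("speaker", {"audio"}),
]
```

With these sample descriptions, an HD video item would be routed to the television rather than the phone, while text would stay on the phone as the only text-capable device.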
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID:
DOI: Not available