
University of Oulu doctoral dissertations


Publisher: University of Oulu
Subject: Technology, mathematics
Location: Print Tietotalo


Usability methodology has matured into a well-defined, industrially relevant field with its own findings, theories, and tools. Its roots lie in applying information technology to user interfaces ranging from control rooms to computers, and more recently to mobile communications devices. The purpose is typically to identify the users’ goals and to test whether a design fulfils the usability criteria. Properly applied, usability methods provide reliable and repeatable results and are excellent tools for fine-tuning existing solutions. The challenge for usability methodologies lies in finding new concepts and predicting their characteristics before testing, especially in the relatively young field of mobile user interfaces. Current usability methods concentrate on utilising available user-interface technologies; they do not provide means to clearly identify, for example, the potential of auditory or haptic output, or of gestural input. Consequently, these new interaction techniques are rarely used, and the long-envisioned useful multimodal user interfaces have yet to appear, despite their assumed and existing potential in mobile devices. Even the most advocated and well-known multimodal interaction concepts, such as combined manual pointing and natural language processing, have not materialised in applications. An apparent problem is the lack of a way to utilise usage-environment analysis in identifying user requirements that could be met with multimodal user interfaces. To harness the full potential of multimodality, tools are needed for identifying both underused and overloaded modalities in current usage contexts. Such tools would also help in putting possibly existing novel interaction paradigms into context and in pointing out their possible deficiencies. In this thesis, a novel framework for analysing the usage environment from a user-centric perspective is presented.
Based on the findings, a designer can decide between offering a set of multiple devices utilising various user-interface modalities and offering a single device that provides the relevant modalities, perhaps by adapting to the usage environment. Furthermore, new techniques for creating mobile user interfaces utilising various modalities are proposed. The framework has evolved from the experience gathered in designing experimental and product-level unimodal and multimodal user-interface solutions for mobile devices. It has generated novel multimodal interaction and interface techniques that can be used as building blocks in system designs.