Multimodal Interfaces for Cell Phones and Mobile Technology

Authors: Sharon Oviatt, Rebecca Lunsford

Institution: Center for Human-Computer Communication, Department of Computer Science & Engineering, Oregon Health & Science University, 20000 NW Walker Road, Beaverton, OR 97006, USA

Abstract: By modeling users' natural spoken and multimodal communication patterns, more powerful and highly reliable interfaces can be designed to support emerging mobile technology. In this paper, we highlight three research directions that are advancing state-of-the-art mobile technology. The first is the development of fusion-based multimodal systems, such as ones that combine speech with pen or touch input, which substantially improve the robustness and stability of system recognition. The second is the modeling of multimodal communication patterns to establish open-microphone engagement techniques that work in challenging multi-person mobile settings. The third is new approaches to adaptive processing, which can transparently guide user input to match system processing capabilities. All three research directions contribute to the design of more reliable, usable, and commercially promising mobile systems of the future.

Keywords: multimodal fusion · mobile interfaces · open-microphone engagement · cognitive modeling · adaptive speech processing
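To make the fusion-based approach in the abstract concrete, the following is a minimal illustrative sketch (not the authors' system) of late fusion: each modality's recognizer produces an n-best list of scored hypotheses, and the system multiplies the scores of semantically compatible speech/pen pairs to pick a joint interpretation. All hypothesis strings, scores, and the compatibility table below are invented for illustration.

```python
# Hypothetical n-best lists from a speech recognizer and a pen/gesture
# recognizer, as (hypothesis, posterior score) pairs.
SPEECH_NBEST = [("zoom in here", 0.6), ("zoom and here", 0.3)]
PEN_NBEST = [("circle:map_region", 0.7), ("arrow:map_region", 0.2)]

# Hypothetical compatibility table: which spoken commands may legally
# pair with which pen gestures.
COMPATIBLE = {
    ("zoom in here", "circle:map_region"),
    ("zoom and here", "arrow:map_region"),
}

def fuse(speech_nbest, pen_nbest, compatible):
    """Return the best (speech, pen, joint_score) triple, or None
    if no compatible pair exists."""
    best = None
    for s_hyp, s_score in speech_nbest:
        for p_hyp, p_score in pen_nbest:
            if (s_hyp, p_hyp) not in compatible:
                continue  # mutual disambiguation: prune illegal pairs
            joint = s_score * p_score  # naive independence assumption
            if best is None or joint > best[2]:
                best = (s_hyp, p_hyp, joint)
    return best

best = fuse(SPEECH_NBEST, PEN_NBEST, COMPATIBLE)
```

Because the top speech hypothesis pairs legally with the top pen hypothesis, the joint best here is ("zoom in here", "circle:map_region"). The pruning step illustrates why such fusion can improve robustness: a misrecognized hypothesis in one modality is discarded when no compatible partner exists in the other.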