Publication
Proceedings - IEEE International Conference on Multimedia and Expo
Paper

Single application model, multiple synchronized views


Abstract

A user interface is a means to an end: its primary goal is to capture user intent and communicate the results of the requested computation. On today's devices, user interaction can take place through a multiplicity of interaction modalities, including speech and visual interfaces. As we evolve toward an increasingly connected world in which we access and interact with applications through multiple devices, it becomes crucial that the various access paths to the underlying content be synchronized. This synchronization ensures that the user interacts with the same underlying content regardless of the interaction modality, despite the differences in presentation that each modality may impose. It also ensures that the effect of user interaction in any given modality is reflected consistently across all available modalities. We describe an application framework that enables tightly synchronized multimodal user interaction. The framework derives its power from representing the application model in a modality-independent manner and from traversing this model to produce the various synchronized multimodal views. As the user interaction proceeds, we maintain our current position in the model, update the application data as determined by user intent, and reflect these updates in the various views being presented. We conclude the paper by outlining an example that demonstrates this tightly synchronized multimodal interaction, and we describe some of the future challenges in building such multimodal frameworks. © 2001 IEEE.
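
The abstract describes a single modality-independent model driving several synchronized views, with user updates in any modality reflected across all of them. The sketch below illustrates that general pattern with an observer-style update loop; it is an assumption-laden illustration, not the paper's framework. The class names (ApplicationModel, VisualView, SpeechView) and the field-based update API are hypothetical, and the full paper describes the mechanism in terms of traversing the application model rather than this simplified notification scheme.

```python
# Illustrative sketch (not from the paper): one modality-independent model
# notifies every registered view whenever user interaction updates a field,
# so all modalities stay synchronized on the same underlying content.

class ApplicationModel:
    """Modality-independent application state shared by all views."""

    def __init__(self):
        self._fields = {}   # field name -> current value
        self._views = []    # registered modality-specific views

    def register_view(self, view):
        self._views.append(view)

    def update(self, field, value, source=None):
        """Apply a user-intent update and reflect it in every other view."""
        self._fields[field] = value
        for view in self._views:
            if view is not source:   # the originating view already shows it
                view.render(field, value)


class VisualView:
    def render(self, field, value):
        print(f"[GUI]    {field} now shows {value!r}")

    def on_user_input(self, model, field, value):
        print(f"[GUI]    user entered {value!r} for {field}")
        model.update(field, value, source=self)


class SpeechView:
    def render(self, field, value):
        print(f"[Speech] saying: {field} is {value!r}")

    def on_user_utterance(self, model, field, value):
        print(f"[Speech] recognized {value!r} for {field}")
        model.update(field, value, source=self)


if __name__ == "__main__":
    model = ApplicationModel()
    gui, speech = VisualView(), SpeechView()
    model.register_view(gui)
    model.register_view(speech)

    # Filling a field by voice is immediately reflected in the GUI,
    # and editing it in the GUI is immediately echoed by the speech view.
    speech.on_user_utterance(model, "destination", "Boston")
    gui.on_user_input(model, "destination", "New York")
```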