Publication
IUI 2005
Conference paper

Two-way adaptation for robust input interpretation in practical multimodal conversation systems
Abstract

Multimodal conversation systems allow users to interact with computers effectively using multiple modalities, such as natural language and gesture. However, these systems have not been widely used in practical applications, mainly due to their limited input understanding capability. As a result, conversation systems often fail to understand user requests and leave users frustrated. To address this issue, most existing approaches focus on improving a system's interpretation capability. Nonetheless, such improvements may still be limited, since they can never cover the entire range of input expressions. Alternatively, we present a two-way adaptation framework that allows both users and systems to dynamically adapt to each other's capabilities and needs during the course of interaction. Compared to existing methods, our approach offers two unique contributions. First, it improves the usability and robustness of a conversation system by helping users dynamically learn the system's capabilities in context. Second, our approach enhances the overall interpretation capability of a conversation system by learning new user expressions on the fly. Our preliminary evaluation shows the promise of this approach. Copyright © 2005 ACM.