ArticuLab

HCII ~ School of Computer Science ~ Carnegie Mellon University

NUMACK

NUMACK Gesturing
NUMACK (Northwestern University Multimodal Autonomous Conversational Kiosk) is an Embodied Conversational Agent (ECA) that gives directions around Northwestern's campus using a combination of speech, gesture, and facial expression. The system interacts with human users by generating novel language and gesture in coordination, using a grammar-based computational model of language together with a gesture planning system. These components work in coordination to express information about the real world, drawing on a domain knowledge base and an evolving model of context, or information state. NUMACK's verbal, non-verbal, and multimodal behaviors are realized through automatically synthesized speech and a kinematic body model. The system updates its model of context and the world by fusing multimodal user input: head movements captured by a stereoscopic head-tracking system, speech processed by automatic speech recognition, and pen input.
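The fusion step described above can be sketched in miniature: timestamped events from each input channel (head tracking, speech recognition, pen) are grouped by temporal proximity, and co-occurring speech and deictic input jointly update the information state. This is an illustrative assumption about how such a fuser might work, not NUMACK's actual architecture or API; all names below are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    channel: str      # "head", "speech", or "pen" (illustrative channels)
    timestamp: float  # seconds since session start
    payload: str      # e.g. a recognized phrase or a pointed-at landmark

@dataclass
class InformationState:
    # Evolving model of context: resolved referents and dialogue history.
    referents: list = field(default_factory=list)
    history: list = field(default_factory=list)

def fuse(events, window=0.5):
    """Group events whose timestamps fall within `window` seconds of each other."""
    groups = []
    for ev in sorted(events, key=lambda e: e.timestamp):
        if groups and ev.timestamp - groups[-1][-1].timestamp <= window:
            groups[-1].append(ev)
        else:
            groups.append([ev])
    return groups

def update_state(state, events, window=0.5):
    for group in fuse(events, window):
        # Deictic input (head or pen) co-occurring with speech can resolve
        # references such as "that building".
        speech = [e.payload for e in group if e.channel == "speech"]
        deixis = [e.payload for e in group if e.channel in ("head", "pen")]
        if speech and deixis:
            state.referents.extend(deixis)
        state.history.append([e.payload for e in group])
    return state

state = update_state(InformationState(), [
    Event("speech", 1.00, "how do I get to that building"),
    Event("pen", 1.20, "Tech Institute"),
    Event("speech", 4.00, "thanks"),
])
# The pen event within 0.5 s of the utterance resolves "that building".
```

The key design choice in this sketch is the time window: multimodal systems commonly align gesture and speech by temporal co-occurrence, since a pointing gesture and its accompanying referring expression overlap closely in time.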