The MAUI Project: Building MultiModal Affective User Interfaces for Everyone

Christine Laetitia Lisetti, Cynthia LeRouge, and Fatma Nasoz

We address some of the current challenges in intelligent interfaces for universal access and present the design of an intelligent interface aimed at 1) processing input from the user's sensory modalities (or modes) via various media, 2) building (or encoding) a model of the user's emotions (MOUE), and 3) adapting its multimedia output to provide the user with easier and more natural technology access and interaction. We identify key research issues relating to one particularly rich application of MOUE, namely home health care provided via telemedicine.

Copyright © AAAI. All rights reserved.