AAAI Publications, Workshops at the Twenty-Fourth AAAI Conference on Artificial Intelligence

Verbal Assistance in Tactile-Map Explorations: A Case for Visual Representations and Reasoning
Christopher Habel, Matthias Kerzel, Kris Lohmann

Last modified: 2010-07-07


Tactile maps give visually impaired people access to spatial-analog information. In contrast to visual maps, tactile maps have lower resolution and can only be inspected sequentially, which complicates extracting spatial relations among distant map entities. Verbal assistance can help overcome these difficulties by substituting verbal descriptions for textual labels and by offering propositional knowledge about spatial relations. Like visual maps, tactile maps are based on visual, spatial-geometric representations, which must be reasoned about in order to generate verbal assistance. We present an approach towards a verbally assisting virtual-environment tactile map (VAVETaM), realized on a computer system with a haptic force-feedback device. In particular, we discuss the tasks of understanding the user's map exploration procedures (MEPs), exploiting the spatial-analog map to anticipate the user's informational needs, reasoning about optimal assistance by taking the user's assumed prior knowledge into account, and generating appropriate verbal instructions and descriptions to augment the map.
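The assistance loop sketched in the abstract — track the haptic stylus, infer what the user is exploring, and verbalize only what they do not yet know — could look roughly like the following. This is a minimal illustrative sketch under assumed names and heuristics (`MapObject`, `Assistant`, a circular-symbol hit test, a "say each object once" policy), not the authors' VAVETaM implementation.

```python
# Hypothetical sketch of a verbal-assistance loop for a virtual tactile map:
# locate the map object under the haptic stylus, anticipate the user's
# informational need, and avoid repeating already-given descriptions.
# All class names and heuristics are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class MapObject:
    name: str
    x: float
    y: float
    radius: float  # assumed circular extent of the tactile symbol

@dataclass
class Assistant:
    objects: list
    known: set = field(default_factory=set)  # objects already described

    def object_at(self, x, y):
        """Return the map object under the stylus position, if any."""
        for obj in self.objects:
            if (obj.x - x) ** 2 + (obj.y - y) ** 2 <= obj.radius ** 2:
                return obj
        return None

    def assist(self, x, y):
        """Anticipate the informational need at (x, y); None means stay silent."""
        obj = self.object_at(x, y)
        if obj is None:
            return None  # free exploration between symbols: no utterance
        if obj.name in self.known:
            return None  # assumed prior knowledge: do not repeat
        self.known.add(obj.name)
        return f"You are touching {obj.name}."
```

A fuller system would replace the single-position hit test with recognition of whole exploration procedures (MEPs) from movement trajectories, but the silence-unless-informative policy above captures the abstract's idea of tailoring assistance to the user's assumed prior knowledge.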
