A Platform for Gaze-Contingent Virtual Environments

Robert Danforth, Andrew Duchowski, Robert Geist, and Elizabeth McAliley

This paper describes hardware and software requirements for the development of a gaze-contingent virtual reality system that incorporates several cues to presence. The user's gaze direction, head position, and head orientation are tracked to allow dynamic level-of-detail changes for rendering. Users can see themselves, rather than representations thereof, within blue-screened virtual environments, and limited vestibular feedback is provided through a motion simulator. The aesthetic appearance of environments is driven by advanced graphical techniques (e.g., radiosity) motivated by the goal of photorealistic representation of natural scenes. Taken together, the components identified in this paper describe a platform suitable for development of a variety of "smart" virtual environments.
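The gaze-contingent level-of-detail idea above can be sketched as follows: render at the finest detail near the tracked point of regard and coarsen with angular eccentricity. This is a minimal illustrative sketch, not the paper's implementation; the function names and the threshold angles are hypothetical choices for the example.

```python
import math

def angular_eccentricity(gaze_dir, obj_dir):
    """Angle in degrees between the gaze vector and the vector toward an object.

    Both arguments are 3-D direction vectors; they need not be normalized.
    """
    dot = sum(g * o for g, o in zip(gaze_dir, obj_dir))
    norm = (math.sqrt(sum(g * g for g in gaze_dir)) *
            math.sqrt(sum(o * o for o in obj_dir)))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def select_lod(eccentricity_deg, thresholds=(5.0, 15.0, 30.0)):
    """Map angular eccentricity to a level of detail.

    0 = finest (foveal) model; larger values = progressively coarser models.
    The threshold angles here are illustrative placeholders, not values
    reported in the paper.
    """
    for level, limit in enumerate(thresholds):
        if eccentricity_deg < limit:
            return level
    return len(thresholds)

# An object straight along the gaze ray gets the finest model;
# one 45 degrees off-axis falls to the coarsest level.
print(select_lod(angular_eccentricity((0, 0, -1), (0, 0, -1))))   # 0
print(select_lod(angular_eccentricity((0, 0, -1), (1, 0, -1))))   # 3
```

In a full system the selected level would index a set of precomputed meshes or texture resolutions, and the gaze vector would be updated each frame from the eye and head trackers.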

Copyright © AAAI. All rights reserved.