This paper describes hardware and software requirements for the development of a gaze-contingent virtual reality system incorporating several cues to presence. The user's gaze direction, head position, and head orientation are tracked to enable dynamic level-of-detail changes during rendering. Users see themselves, rather than representations of themselves, within blue-screened virtual environments, and limited vestibular feedback is provided through a motion simulator. The aesthetic appearance of the environments is driven by advanced graphical techniques (e.g., radiosity), motivated by the goal of photorealistic representation of natural scenes. Taken together, the components identified in this paper constitute a platform suitable for developing a variety of "smart" virtual environments.
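
The gaze-contingent level-of-detail idea mentioned above can be sketched roughly as follows: render at full detail near the tracked gaze direction and at progressively coarser detail with increasing angular eccentricity. This is only an illustrative sketch, not the paper's implementation; the level count and the eccentricity thresholds below are hypothetical values chosen for demonstration.

```python
import math

def lod_for_eccentricity(gaze_dir, object_dir,
                         levels=(0, 1, 2, 3),
                         thresholds_deg=(5.0, 15.0, 30.0)):
    """Select a level of detail from the angular distance between the
    user's gaze direction and the direction to an object.

    Level 0 = full detail (foveal region); higher levels = coarser
    geometry for the periphery. Thresholds are illustrative only.
    Both input vectors are assumed to be unit-length.
    """
    # Angle between the two unit vectors, clamped for numerical safety.
    dot = max(-1.0, min(1.0, sum(g * o for g, o in zip(gaze_dir, object_dir))))
    eccentricity = math.degrees(math.acos(dot))
    for level, threshold in zip(levels, thresholds_deg):
        if eccentricity <= threshold:
            return level
    # Beyond the last threshold: coarsest available level.
    return levels[-1]

# An object directly along the gaze direction renders at full detail.
print(lod_for_eccentricity((0.0, 0.0, -1.0), (0.0, 0.0, -1.0)))  # 0
# An object 90 degrees off-axis falls to the coarsest level.
print(lod_for_eccentricity((0.0, 0.0, -1.0), (1.0, 0.0, 0.0)))   # 3
```

In a real system, the selected level would index precomputed meshes or mipmap-style detail tiers, with the gaze vector updated each frame from the eye tracker.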