The control architecture for an intelligent, fully autonomous mobile robot is based on the philosophical view that reflexive behaviors and cognitive modules should be combined, in a complementary fashion, into situated agents supported by a knowledge-based perceptual system. Behavioral, cognitive, and perceptual subsystems make up the agent's intelligence architecture, and each of these three principal components decomposes into layered, independent, parallel, distributed functions.

The behavior-based component provides the robot's basic instinctive competences, while the cognitive component manipulates perceptual knowledge representations through a reasoning mechanism that performs higher machine-intelligence functions such as planning. Cognitive control affects behavior directly, through motivation inputs to behavior functions and through arbitration of behavior outputs. A cognitive planning activity can therefore execute a plan merely by setting the robot's motivation state and letting the behavior-based subsystem handle the details of plan execution.

The perceptual system offers a general framework for sensory knowledge generation, abstraction, and integration. All information in the perception knowledge bases derives from fusing real-time sensor data in a model-driven, multisensor system; this approach yields a current, consensual, and consistent interpretation of the environment observed by the agent. The perceptual module affects behavior through virtual sensor inputs supplied to the behavior algorithms, and both the cognitive and perceptual modules can in turn be influenced by behavior status inputs.
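The interplay described above can be illustrated with a minimal sketch: reflexive behaviors compute activation levels from virtual-sensor readings, a cognitive planner influences selection only by setting a motivation state, and an arbiter picks the winning behavior. All names here (`Behavior`, `AvoidObstacle`, `SeekGoal`, `arbitrate`, the sensor keys) are hypothetical illustrations, not the authors' implementation.

```python
class Behavior:
    """A reflexive competence: maps virtual-sensor readings to a motor command."""

    def activation(self, sensors, motivation):
        raise NotImplementedError

    def command(self, sensors):
        raise NotImplementedError


class AvoidObstacle(Behavior):
    """Instinctive competence: fires strongly near obstacles, ignoring goals."""

    def activation(self, sensors, motivation):
        # Hypothetical virtual-sensor key; a real system would use fused range data.
        return 1.0 if sensors["obstacle_distance"] < 0.5 else 0.0

    def command(self, sensors):
        return "turn_away"


class SeekGoal(Behavior):
    """Goal-directed competence: its drive comes entirely from the motivation state."""

    def activation(self, sensors, motivation):
        # The cognitive planner "executes" its plan just by raising this value.
        return motivation.get("seek_goal", 0.0)

    def command(self, sensors):
        return "move_toward_goal"


def arbitrate(behaviors, sensors, motivation):
    """Winner-take-all output arbitration: the most activated behavior drives the motors."""
    winner = max(behaviors, key=lambda b: b.activation(sensors, motivation))
    return winner.command(sensors)
```

Under this sketch, setting `motivation = {"seek_goal": 0.8}` is the planner's entire contribution: with no nearby obstacle the goal-seeking behavior wins, yet a close obstacle still triggers reflexive avoidance without any replanning, which is the division of labor the architecture argues for.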