We believe that high-level tasks, such as strategic planning, are the role of human users, and that the interface should be the tool by which humans can easily, intelligently, and even intuitively communicate their goals to a robot or a team of robots. We will be exhibiting one such interface for controlling a team of mobile robots. By drawing on the sketch pad of a tablet PC, users can draw and label environment landmarks, as well as indicate goal points and navigation paths for a single robot or a group of robots. The information that the human provides to the robots via the interface need not be absolute: from the robot’s point of view, the task is defined by its real-time sensing and the relative positions of paths and landmarks. We are investigating the relationships among the accuracy of the user’s knowledge, the conveyance of that information via the interface, the amount of adaptability the system requires to compensate for inaccuracies, and the workload placed on the user.
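The idea that sketched information need not be absolute can be illustrated with a minimal sketch (hypothetical names and logic, not the exhibited system): each waypoint of a drawn path is stored as an offset from its nearest labeled landmark, so at run time the robot can re-anchor the path to the landmark positions it actually senses, even when the user's sketch was geometrically inaccurate.

```python
import math

def encode_relative(path, landmarks):
    """For each sketched (x, y) waypoint, record the nearest sketched
    landmark's label and the waypoint's offset from that landmark."""
    encoded = []
    for x, y in path:
        label, (lx, ly) = min(
            landmarks.items(),
            key=lambda kv: math.hypot(x - kv[1][0], y - kv[1][1]),
        )
        encoded.append((label, x - lx, y - ly))
    return encoded

def decode_with_sensed(encoded, sensed_landmarks):
    """Rebuild the path using the landmark positions the robot senses,
    which may differ from where the user sketched them."""
    return [(sensed_landmarks[label][0] + dx, sensed_landmarks[label][1] + dy)
            for label, dx, dy in encoded]

if __name__ == "__main__":
    sketched = {"door": (0.0, 0.0), "table": (10.0, 0.0)}
    path = [(1.0, 1.0), (9.0, 1.0)]
    rel = encode_relative(path, sketched)
    # Suppose the robot senses the table 2 m farther along than sketched:
    sensed = {"door": (0.0, 0.0), "table": (12.0, 0.0)}
    print(decode_with_sensed(rel, sensed))  # second waypoint shifts with the table
```

Because the second waypoint was anchored to the table, it moves along with the sensed table position, while the door-anchored waypoint stays put; this is one simple way a system can absorb inaccuracy in the user's sketch without extra user workload.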