In previous work (cf. Spector and Hendler, 1993), we described a simulated household cleaning domain, with a simulated robot agent that performs cleaning tasks. Although this agent did not vacuum the floor, it was designed for tasks such as putting dirty socks into the hamper, cleaning up spills, and even putting out a fire in the kitchen. The simulator divided the apartment into over twenty thousand spatial units, on which sensing and effecting were based: the robot could only see what was in a projected cone of vision, could only move things it could reach, and so on. Although a "robot" was used as the domain for this work, and although the simulation was more complex than most AI micro-worlds, the simulator did not map well onto real robotics. While this was a fine domain for AI planning research, its assumptions were unrealistic for real robotics work -- sensors were assumed to be perfect, effectors had little or no error, and the robot's perceptions were "symbolic." Our claims that a reactive substrate on a real robot might someday provide something of this sort were not well received by the AI-based mobile robotics community. As one researcher put it, "Buy a robot!" We did.

In this brief paper, we describe our attempts to provide a vacuuming behavior for this robot. We start with a brief description of the robot and the task (as examined so far), followed by a brief description of our plans for this domain. We conclude with some discussion of the differences between "Itchy," our subsumption-based robot, and "HomeBot," the simulated agent of our previous work, and describe how we are attempting to (eventually) merge the two projects.