We present two new systems that learn to understand natural language instructions and carry out the sensorimotor tasks they imply. These systems constitute a further step in our ongoing investigation of language acquisition by machines [Henis et al. 1994; Henis et al. 1995]. Our approach is based on the idea that language skills and the understanding of the meaning conveyed by spoken communication should be acquired simultaneously and in conjunction with sensorimotor integration. One system is a simulated manipulator in a three-dimensional blocks world; the second is a small mobile robot in an office environment. The systems learn from semantic-level reinforcement feedback signals provided by the user. Associations between words and sensory inputs are derived from the mutual information between constituents of the inputs and the motor outputs. In the course of interacting with the world, the systems construct an internal representation of the world that can be manipulated linguistically. The systems start off without any task-relevant words. It takes about 400 sentences (approximately 40 for the simpler mobile robot) for a system to acquire enough knowledge to respond reasonably to user input and to relate to its environment in terms of given object names, action names, and conjunctions of features. The success of these systems shows that the principles underlying our previous systems scale to more complex ones.
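The abstract does not specify how the mutual-information association is estimated. As a minimal illustrative sketch (not the authors' implementation), one can compute pointwise mutual information between words and action labels from co-occurrence counts over hypothetical (sentence, action) training episodes:

```python
import math
from collections import Counter

def word_action_pmi(episodes):
    """Estimate pointwise mutual information between words and actions.

    episodes: list of (sentence, action) pairs, e.g. ("pick up the block", "grasp").
    Returns a dict mapping (word, action) to its PMI score in bits.
    """
    word_counts = Counter()
    action_counts = Counter()
    pair_counts = Counter()
    n = len(episodes)
    for sentence, action in episodes:
        action_counts[action] += 1
        # Count each word at most once per episode (presence, not frequency).
        for word in set(sentence.lower().split()):
            word_counts[word] += 1
            pair_counts[(word, action)] += 1
    pmi = {}
    for (word, action), c in pair_counts.items():
        p_wa = c / n                      # joint probability of word and action
        p_w = word_counts[word] / n       # marginal probability of word
        p_a = action_counts[action] / n   # marginal probability of action
        pmi[(word, action)] = math.log2(p_wa / (p_w * p_a))
    return pmi

# Hypothetical training episodes for a blocks-world manipulator:
episodes = [
    ("pick up the red block", "grasp"),
    ("put down the block", "release"),
    ("pick the blue block", "grasp"),
    ("release the block", "release"),
]
scores = word_action_pmi(episodes)
```

Words that co-occur selectively with an action ("pick" with "grasp") receive high scores, while words that appear in every episode ("block") receive a score near zero, which is the basic mechanism by which such associations can separate action names from uninformative vocabulary.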