Humanoid robots will increasingly become a part of human everyday lives, but require more natural and simplified methods for control and human-robot interaction. Our approach to addressing these challenges is to use biologically inspired notions of behavior-based control and to endow robots with the ability to imitate, so that they can be programmed and interacted with through demonstration and imitation. Our approach, based on neuroscience evidence, structures the motor system into a collection of primitives, which are used both to generate the humanoid’s movement repertoire and to provide prediction and classification capabilities for visual perception and interpretation. Thus, what the humanoid can do helps it understand what it sees, and vice versa. We describe the behavior-based background of our work and the neuroscience evidence on which our humanoid motor control and imitation model is based. Next, we describe our use of human movement data as input and the humanoid simulation test-bed for evaluation. We follow with a detailed discussion of three means of deriving primitives, the key component of our model, and describe implementations of each, along with experimental results demonstrated using human movement, captured with vision or magnetic markers, and imitated on a humanoid torso with dynamics performing various movements from dance and athletics.