The main difficulties researchers face in understanding emotions are difficulties only because of the narrowness of our views on emotions: we cannot free ourselves from the notion that emotions are necessarily human emotions. I will argue that if animals have emotions, then so can robots. Studies in neuroscience have shown that animal models, despite their limitations, have contributed significantly to our understanding of the functional and mechanistic aspects of emotion. I will suggest that one of the main functions of emotions is to achieve multi-level communication of simplified but high-impact information. How this function is achieved in the brain depends on the species and on the specific emotion considered. The classical view that emotions are computed by specialized brain centers, such as the limbic system, is criticized. I will suggest instead that an ensemble of well-known neurobiological phenomena, collectively referred to as neuromodulation, provides a useful framework for understanding how emotions arise, are maintained, and interact with other aspects of behavior and cognitive processing. This framework suggests new ways in which robot emotions can be implemented and can fulfill their function.