Following recent neurophysiological research, one important role of emotions is to provide a mechanism for adequate and efficient responses to relevant stimuli. In this paper we propose a methodology for implementing such a mechanism, based on a previously presented emotion-based agent model. The model is founded on a dual knowledge-representation paradigm: a stimulus reaching the agent is processed from two different, simultaneous perspectives -- a simple one (termed perceptual) and a complex one (termed cognitive) -- from which two distinct representation schemata are derived. The perceptual representation captures the relevant aspects of the environment, aiming at a quick response to urgent situations, while the cognitive one is directed towards high-level cognitive processing. The two representations are associated and stored in the agent's memory so that, when the agent later faces a similar situation, the perceptual representation helps retrieve the cognitive one efficiently. This indexing mechanism provides a fast algorithm for finding cognitive matches. Furthermore, the matching is assumed to take place in two corresponding metric spaces. This paper presents a twofold strategy for constructing the perceptual representation. The first part consists of adapting the perceptual metric so as to bring it closer to the cognitive metric; the second upgrades the perceptual representation with additional components (e.g., features). Techniques borrowed from nonmetric Multidimensional Scaling are used to pursue both goals.
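The indexing mechanism described above can be sketched as an associative memory in which cheap perceptual vectors serve as keys for retrieving rich cognitive representations via nearest-neighbour search in the perceptual metric space. This is only a minimal illustration: the class name, the fixed-length feature vectors, and the Euclidean perceptual metric are assumptions for the sketch, not part of the model's specification.

```python
import numpy as np

class EmotionBasedMemory:
    """Illustrative sketch (hypothetical names): associative memory in
    which perceptual representations index cognitive representations."""

    def __init__(self):
        self._perceptual = []   # cheap, low-dimensional index vectors
        self._cognitive = []    # arbitrary rich representations

    def store(self, perceptual_vec, cognitive_repr):
        # Associate the two representations derived from the same stimulus.
        self._perceptual.append(np.asarray(perceptual_vec, dtype=float))
        self._cognitive.append(cognitive_repr)

    def retrieve(self, perceptual_vec):
        # Nearest neighbour in the perceptual metric space acts as a
        # quick index into the stored cognitive representations.
        query = np.asarray(perceptual_vec, dtype=float)
        dists = [np.linalg.norm(p - query) for p in self._perceptual]
        return self._cognitive[int(np.argmin(dists))]
```

The point of the scheme is that the expensive cognitive match is replaced, at retrieval time, by a cheap search in the perceptual space.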
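The first part of the strategy, adapting the perceptual metric towards the cognitive one, can be illustrated by tuning per-feature weights of a weighted Euclidean metric so that perceptual distances approximate given cognitive distances. For simplicity this sketch minimises a metric stress (squared distance mismatch) by gradient descent; nonmetric MDS proper, as referenced in the paper, would instead only match the rank order of distances (e.g., via isotonic regression). All names and parameters here are illustrative assumptions.

```python
import numpy as np

def adapt_perceptual_metric(X, D_cog, steps=500, lr=0.05):
    """Hypothetical sketch: adapt non-negative per-feature weights w of a
    weighted Euclidean perceptual metric d_w(x, y) = sqrt(sum w_m (x_m - y_m)^2)
    so that d_w approximates the cognitive distances D_cog."""
    n, k = X.shape
    w = np.ones(k)                       # initial (unadapted) weights
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    for _ in range(steps):
        grad = np.zeros(k)
        for i, j in pairs:
            diff2 = (X[i] - X[j]) ** 2
            d = np.sqrt(w @ diff2) + 1e-12
            # Gradient of the squared mismatch (d - D_cog)^2 w.r.t. w.
            grad += (d - D_cog[i, j]) * diff2 / d
        w = np.maximum(w - lr * grad, 0.0)   # keep weights non-negative
    return w
```

Features that matter for the cognitive metric receive larger weights, so the adapted perceptual metric ranks neighbours more like the cognitive one does.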