In this paper we present an approach to symbol anchoring based on mapping sequences of distance measurements from simple sensors. The sensory space of the mobile robot is pre-structured according to its experiences as it first moves around in unexplored environments. This pre-structuring depends not only on environmental features but also on the type of behaviour the robot exhibits. Object representations correspond to streams of sensory signals that are mapped onto this sensory space and classified by a sequence detection mechanism. We report novel experimental results obtained with this technique, comparing variants of the approach against simpler methods. We present data from experiments with varied parameters and input data types, such as motor and distance sensor information. In our real mobile robot scenario, the robot successfully discriminates a number of objects, which can then be anchored in a second step using input from a human supervisor.
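The pipeline described above can be illustrated with a minimal sketch. This is not the paper's actual implementation: the pre-structuring step is approximated here by simple online competitive learning over distance readings, and the sequence detection mechanism by nearest-template matching under edit distance; all function names (`train_prototypes`, `encode`, `classify`) and parameter choices are illustrative assumptions.

```python
import math
import random

def dist(a, b):
    # Euclidean distance between two sensor readings (tuples of floats).
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def train_prototypes(readings, n_prototypes=2, iters=300, lr=0.2, seed=0):
    # Pre-structure the sensory space from exploration data: learn a few
    # prototype vectors via online competitive learning (a hypothetical
    # stand-in for the paper's experience-based pre-structuring).
    rnd = random.Random(seed)
    protos = [list(p) for p in rnd.sample(readings, n_prototypes)]
    for _ in range(iters):
        x = rnd.choice(readings)
        k = min(range(n_prototypes), key=lambda i: dist(protos[i], x))
        protos[k] = [p + lr * (xi - p) for p, xi in zip(protos[k], x)]
    return protos

def encode(stream, protos):
    # Map a stream of sensory signals onto the pre-structured space:
    # each reading becomes the index of its nearest prototype.
    return [min(range(len(protos)), key=lambda i: dist(protos[i], x))
            for x in stream]

def classify(sequence, templates):
    # Sequence detection by nearest stored template under edit distance
    # (an assumed, simple form of sequence matching).
    def edit(a, b):
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                               prev[j - 1] + (ca != cb)))
            prev = cur
        return prev[-1]
    return min(templates, key=lambda label: edit(sequence, templates[label]))
```

In use, prototypes would be learned during an initial exploration phase; an object's signal stream is then encoded as a prototype-index sequence and compared against stored sequences, whose labels a human supervisor could supply in the anchoring step.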