A COSPAL Subsystem: Solving a Shape-Sorter Puzzle

Michael Felsberg, Per-Erik Forssén, Anders Moe, and Gösta Granlund

Programming a robot to solve a simple shape-sorter puzzle is trivial. Devising a Cognitive System Architecture that allows the system to find out by itself how to go about a solution is far less so. The development of such an architecture is one of the aims of the COSPAL project, leading to new techniques in vision-based Artificial Cognitive Systems which allow the development of robust systems for real dynamic environments. The systems developed within the project itself, however, remain in simplified scenarios, as does the shape-sorter problem described in the present paper. The key property of the described system is its robustness. Since we apply association strategies on local features, the system behaves robustly under a wide range of distortions, such as occlusion and changes in colour and intensity. The segmentation step applied in many systems known from the literature is replaced with local associations and view-based hypothesis validation. The hypotheses used in our system are based on the anticipated state of the visual percepts; this state replaces explicit modeling of shapes. The current state is chosen by a voting system and verified against the true visual percepts. The anticipated state is obtained from the association to the manipulator actions, where reinforcement learning replaces the explicit calculation of actions. These three departures from classical schemes allow the design of a much more generic and flexible system with a high level of robustness. On the technical side, the channel representation of information and associative learning in terms of the channel learning architecture are essential ingredients of the system. It is the properties of locality, smoothness, and non-negativity that make these techniques suitable for this kind of application. The paper gives brief descriptions of how the different system parts have been implemented and shows some examples from our tests.
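As a concrete illustration of the channel representation mentioned above, the sketch below encodes a scalar value into a vector of local, smooth, non-negative channel activations using a cos² kernel, a form commonly used in the channel-representation literature. The kernel, spacing, and function name are illustrative assumptions for this note, not code from the COSPAL system.

```python
import numpy as np

def channel_encode(x, centers, spacing=1.0):
    """Encode a scalar x into cos^2 channel activations.

    Each channel responds only near its center (locality), the response
    varies smoothly with x (smoothness), and all activations are >= 0
    (non-negativity). Kernel form and spacing are illustrative choices.
    """
    d = (x - np.asarray(centers, dtype=float)) / spacing
    k = np.cos(np.pi * d / 3.0) ** 2
    k[np.abs(d) >= 1.5] = 0.0  # compact support: only ~3 neighbouring channels respond
    return k

# Example: encode x = 4.2 against integer channel centers 0..9;
# only the channels around 3, 4, and 5 give non-zero activations.
centers = np.arange(10)
print(channel_encode(4.2, centers))
```

Such sparse, non-negative vectors are what the associative learning stages operate on, rather than raw feature values.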
