Proceedings: Proceedings of the AAAI Conference on Artificial Intelligence, 5
Volume: Engineering
Track: AI Language and Architectures
Abstract:
Recently, researchers have suggested several computational models in which one programs by specifying large networks of simple devices. Such models are interesting because they go to the roots of concurrency - the circuit level. A problem with these models is that it is unclear how to program large systems, and it is expensive to implement many features that are taken for granted in symbolic programming languages. This paper describes the Concurrent Inference System (CIS) and its implementation on a massively concurrent network model of computation. It shows how much of the functionality of current rule-based systems can be implemented in a straightforward manner within such models. Unlike conventional implementations of rule-based systems, in which the inference engine and rule sets are clearly divided at run time, CIS compiles the rules into a large static concurrent network of very simple devices. In this network the rules and the inference engine are no longer distinct. The Thinking Machines Corporation Connection Machine - a 65,536-processor SIMD computer - is then used to run the network. On the current implementation, real-time user-system interaction is possible with up to 100,000 rules.
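The abstract's central idea is that rules are compiled into a fixed network of very simple devices and inference happens by updating every device in lockstep, with no separate inference engine at run time. The following Python fragment is a minimal, hypothetical sketch of that idea, not the paper's implementation: each rule body acts as an AND-style device, each concluded fact as an OR-style device, and forward chaining is just repeated synchronous sweeps. The rule set, fact names, and function names are all invented for illustration.

# Hypothetical sketch: compile propositional rules into a static network of
# simple devices and infer by synchronous sweeps (loosely SIMD-like).

# Each rule is (body, head): "bird AND healthy -> flies", etc. (example data).
rules = [
    ({"bird", "healthy"}, "flies"),
    ({"penguin"}, "bird"),
    ({"flies", "hungry"}, "hunts"),
]

def compile_network(rules):
    """Group rule bodies under the fact they conclude: one OR device per fact,
    one AND device (a frozenset of required facts) per rule body."""
    network = {}
    for body, head in rules:
        network.setdefault(head, []).append(frozenset(body))
    return network

def run(network, facts, max_sweeps=100):
    """Synchronous sweeps: every device recomputes from the previous state at
    once, then all newly derived facts are committed together."""
    state = set(facts)
    for _ in range(max_sweeps):
        fired = {head for head, bodies in network.items()
                 if any(body <= state for body in bodies)}
        new_state = state | fired
        if new_state == state:      # fixpoint: no device changed, inference done
            return state
        state = new_state
    return state

if __name__ == "__main__":
    net = compile_network(rules)
    print(run(net, {"penguin", "healthy", "hungry"}))
    # Three sweeps derive bird, then flies, then hunts.

In this toy version the "network" is rebuilt in software on every sweep; the point of the abstract's approach is that on a machine like the Connection Machine each device can be a processor (or a handful of them), so one sweep is a single data-parallel step over the whole rule base.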