Proceedings: Innovative Applications of Massive Parallelism: Papers from the 1993 AAAI Spring Symposium
Abstract:
Connectionist approaches in artificial intelligence can benefit from implementations that take advantage of massive parallelism, and the network topologies encountered in connectionist work often exhibit such parallelism. This paper discusses additional parallelism exploited prior to the actual network implementation: parallelism is used to facilitate aggressive neural network training algorithms that enhance the accuracy of the resulting network. The computational expense of this training approach is minimized through the use of massive parallelism. The training algorithms are being implemented in a prototype for a challenging sensor processing application. Our neural network training approach produces networks whose accuracy exceeds the limited accuracy typical of connectionist applications. The approach addresses function complexity, network capacity, and training algorithm aggressiveness. Symbolic computing tools are used to develop algebraic representations for the gradient and Hessian of the least-squares cost function for feed-forward network topologies with fully connected and locally connected layers. These representations are transformed into block-structured form, integrated with a full-Newton network training algorithm, and executed on vector/parallel computers.
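For reference, the full-Newton training scheme mentioned above has the standard least-squares form sketched below; the notation is a minimal illustration assumed for clarity, not drawn from the paper itself.

    E(w)     = \tfrac{1}{2} \sum_{i=1}^{N} \lVert f(x_i; w) - t_i \rVert^2   % least-squares cost over N training pairs (x_i, t_i)
    g(w)     = \nabla_w E(w)                                                 % gradient with respect to the weight vector w
    H(w)     = \nabla_w^2 E(w)                                               % Hessian with respect to w
    w_{k+1}  = w_k - H(w_k)^{-1} g(w_k)                                      % full-Newton weight update

Solving the linear system H(w_k) \Delta w = -g(w_k) dominates the cost of each update; it is this step whose block structure would admit the vector/parallel execution the abstract describes.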