Published:
May 2003
Proceedings:
Proceedings of the Sixteenth International Florida Artificial Intelligence Research Society Conference (FLAIRS 2003)
Track:
All Papers
Abstract:
When model input variables appear redundant, common practice is simply to drop some of them before model identification until the redundancy is removed. As a result, the final model has forgotten the interdependency in the original input data, which may be an essential condition for model validity. We provide a practical approach to neural network modeling in which the final model also incorporates a "memory" of the multi-collinearity in the training inputs and checks new input vectors for consistency with this pattern. We approach the problem stepwise, pointing out the benefits gained or lost at each step as model complexity increases. The steps lead naturally to building implicit models, which also handle noise in the inputs in close resemblance to total least squares. The practical tool for this is a feedforward network with a specifically selected configuration.
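The core idea of retaining a "memory" of input collinearity can be sketched in a minimal form. The snippet below is an illustration of the general principle only, not the authors' specific network architecture: the training inputs lie near a lower-dimensional subspace because of multi-collinearity, and the residual of a new input vector against that subspace serves as the consistency check. The subspace is recovered via the SVD, the same decomposition that underlies total least squares; all variable names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training inputs with an exact linear dependency:
# x3 = x1 + 2*x2, so the 3-D inputs actually lie on a 2-D plane.
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
X = np.column_stack([x1, x2, x1 + 2 * x2])

# "Memory" of the multi-collinearity: the principal subspace of the
# centered training inputs, obtained from the SVD.
mean = X.mean(axis=0)
_, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
rank = int((s > 1e-8 * s[0]).sum())  # effective rank (2 in this example)
basis = Vt[:rank]                    # orthonormal basis of the subspace

def consistency_residual(x):
    """Distance of a new input vector from the training-input subspace."""
    d = x - mean
    return float(np.linalg.norm(d - basis.T @ (basis @ d)))

# A vector obeying x3 = x1 + 2*x2 is consistent with the training
# pattern; one that breaks the dependency yields a large residual.
ok = np.array([1.0, 1.0, 3.0])
bad = np.array([1.0, 1.0, 10.0])
```

In the paper's approach this check is carried by the network itself rather than a separate SVD step, but the flagging behavior is analogous: small residuals indicate inputs consistent with the learned interdependency.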
ISBN 978-1-57735-177-1
Published by The AAAI Press, Menlo Park, California.