Published:
May 2001
Proceedings:
Proceedings of the Fourteenth International Florida Artificial Intelligence Research Society Conference (FLAIRS 2001)
Track:
All Papers
Abstract:
Theoretically well-founded, Support Vector Machines (SVMs) are well known to be suited to solving classification problems efficiently. Although improved generalization is the main goal of this type of learning machine, recent work has tried to use SVMs differently. For instance, feature selection has recently been viewed as an indirect consequence of the SVM approach. In this paper, we also exploit SVMs beyond their original purpose: we investigate them as a data-reduction technique, useful for improving case-based learning algorithms, which are sensitive to noise and computationally expensive. By adopting the margin-maximization principle for reducing the structural risk, our strategy allows us not only to eliminate irrelevant instances but also to improve the performance of the standard k-Nearest-Neighbor classifier. A wide comparative study on several benchmarks from the UCI repository shows the utility of our approach.
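The idea sketched in the abstract, using an SVM as a data-reduction step before nearest-neighbor classification, can be illustrated in a few lines. The sketch below is an assumption-laden illustration using scikit-learn, not the authors' original implementation: it simply keeps the SVM's support vectors as the reduced training set for a k-NN classifier.

```python
# Hedged sketch (not the paper's exact method): reduce a k-NN training
# set to the support vectors of a linear SVM, following the general
# idea of SVM-based instance selection described in the abstract.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Train an SVM; its support vectors are the instances nearest the
# decision boundaries, which we retain as the reduced case base.
svm = SVC(kernel="linear", C=1.0).fit(X_tr, y_tr)
idx = svm.support_                      # indices of support vectors
X_red, y_red = X_tr[idx], y_tr[idx]     # reduced training set

# k-NN trained only on the retained instances.
knn = KNeighborsClassifier(n_neighbors=3).fit(X_red, y_red)
acc = knn.score(X_te, y_te)
print(f"kept {len(idx)}/{len(X_tr)} instances, test accuracy {acc:.2f}")
```

In practice the support vectors are a small fraction of the training data, so the k-NN classifier's memory footprint and query cost both shrink; whether accuracy is preserved depends on the data set and the SVM's kernel and regularization parameters.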
ISBN 978-1-57735-133-7
Published by The AAAI Press, Menlo Park, California.