Proceedings:
Proceedings of the Twentieth International Conference on Machine Learning
Abstract:
Theoretical and experimental analyses of bagging indicate that it is primarily a variance reduction technique. This suggests that bagging should be applied to learning algorithms tuned to minimize bias, even at the cost of some increase in variance. We test this idea with Support Vector Machines (SVMs) by employing out-of-bag estimates of bias and variance to tune the SVMs. Experiments indicate that bagging of low-bias SVMs (the "lobag" algorithm) never hurts generalization performance and often improves it compared with well-tuned single SVMs and bags of individually well-tuned SVMs.
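The sketch below is an illustrative, minimal rendering of the idea described in the abstract, not the authors' implementation: it bags low-bias SVMs with scikit-learn and reads off an out-of-bag accuracy estimate. The dataset, kernel, and parameter values (C, gamma, number of estimators) are assumptions chosen only for demonstration; the paper's lobag procedure tunes the SVMs using out-of-bag estimates of bias and variance rather than a single accuracy score.

```python
# Illustrative sketch (assumed setup, not the paper's code):
# bag low-bias SVMs and use the out-of-bag estimate to assess the ensemble.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.ensemble import BaggingClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# A low-bias (high-variance) SVM: a large C weakens regularization,
# letting each bootstrap replicate fit its sample closely.
low_bias_svm = SVC(kernel="rbf", C=100.0, gamma="scale")

# Bagging averages away much of that variance; oob_score=True provides an
# out-of-bag performance estimate without a separate validation set.
bag = BaggingClassifier(low_bias_svm, n_estimators=25, oob_score=True,
                        random_state=0)
bag.fit(X_tr, y_tr)

print("out-of-bag accuracy:", bag.oob_score_)
print("test accuracy:      ", bag.score(X_te, y_te))
```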