Within the machine learning community, ensemble learning methods, which combine multiple learned models, are gaining popularity because they tend to generalize better than single models. However, ensemble models have so far been learned only in batch mode, in which all of the training examples are processed together, often in multiple passes. Online learning instead builds a model by processing each training example once, in sequence. My thesis will present a framework that combines online learning and ensemble learning. To that end, I have so far developed two online ensemble learning algorithms: online versions of the popular bagging and boosting algorithms. I have shown empirically that both online algorithms converge to the same prediction performance as their batch counterparts, and I have proved this convergence for online bagging. My online ensemble learning framework will make ensemble learning usable in data mining tasks whose datasets are often too large for batch algorithms to handle.
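The online bagging idea mentioned above can be approximated by giving each base model a Poisson(1)-distributed number of updates per arriving example, which mimics the bootstrap resampling of batch bagging without storing the data. The sketch below is illustrative only: the class names and the toy nearest-centroid base learner are my own assumptions for the example, not components of the thesis.

```python
import math
import random
from collections import defaultdict


class OnlineCentroid:
    """Toy online base learner (illustrative): keeps a running mean per class
    and predicts the class of the nearest centroid."""

    def __init__(self):
        self.counts = defaultdict(int)
        self.means = {}

    def update(self, x, y):
        self.counts[y] += 1
        m = self.means.get(y, [0.0] * len(x))
        # Incremental mean update: m += (x - m) / n
        self.means[y] = [mi + (xi - mi) / self.counts[y] for mi, xi in zip(m, x)]

    def predict(self, x):
        if not self.means:
            return None
        return min(
            self.means,
            key=lambda c: sum((a - b) ** 2 for a, b in zip(x, self.means[c])),
        )


def poisson1(rng):
    """Sample k ~ Poisson(lambda=1) via Knuth's multiplication method."""
    threshold = math.exp(-1.0)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1


class OnlineBagging:
    """Online bagging sketch: each example updates each base model
    k ~ Poisson(1) times; prediction is an unweighted majority vote."""

    def __init__(self, n_models=10, seed=0):
        self.rng = random.Random(seed)
        self.models = [OnlineCentroid() for _ in range(n_models)]

    def update(self, x, y):
        for model in self.models:
            for _ in range(poisson1(self.rng)):
                model.update(x, y)

    def predict(self, x):
        votes = defaultdict(int)
        for model in self.models:
            label = model.predict(x)
            if label is not None:
                votes[label] += 1
        return max(votes, key=votes.get) if votes else None


if __name__ == "__main__":
    random.seed(7)
    ens = OnlineBagging(n_models=10, seed=1)
    # Stream two well-separated Gaussian clusters, one example at a time.
    for _ in range(200):
        ens.update([random.gauss(0, 0.3), random.gauss(0, 0.3)], 0)
        ens.update([random.gauss(3, 0.3), random.gauss(3, 0.3)], 1)
    print(ens.predict([0.1, -0.1]), ens.predict([2.9, 3.1]))
```

The Poisson(1) draw works because a bootstrap sample of size N includes each example a Binomial(N, 1/N) number of times, which tends to Poisson(1) as N grows, so the ensemble never needs to buffer the stream.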