Case study on bagging stable classifiers for data streams

Abstract

Ensembles of classifiers are among the strongest classifiers in most data mining applications. Bagging ensembles exploit the instability of base classifiers by training them on different bootstrap replicates. It has been shown that bagging unstable classifiers, such as decision trees, generally yields good results, whereas bagging stable classifiers, such as k-NN, makes little difference. However, recent work suggests that this finding applies to the classical batch data mining setting rather than the data stream setting. We present an empirical study that supports this observation.
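To make the stream setting concrete: since bootstrap replicates cannot be drawn from a stream seen only once, stream bagging is commonly approximated by giving each base model every incoming instance k times, with k drawn from a Poisson(1) distribution (the online bagging scheme of Oza and Russell). The following is a minimal, self-contained sketch of that idea with a toy 1-nearest-neighbour base learner as an example of a stable classifier; the class and method names are illustrative, not from the paper.

```python
import random
from collections import Counter

class OnlineBaggingEnsemble:
    """Sketch of online bagging: each incoming instance is shown to each
    base model k times, k ~ Poisson(1), approximating bootstrap sampling
    on a data stream."""

    def __init__(self, make_base, n_models=10, seed=42):
        self.models = [make_base() for _ in range(n_models)]
        self.rng = random.Random(seed)

    def _poisson1(self):
        # Knuth's algorithm for sampling Poisson(lambda=1)
        limit, k, p = 2.718281828459045 ** -1, 0, 1.0
        while True:
            p *= self.rng.random()
            if p <= limit:
                return k
            k += 1

    def learn_one(self, x, y):
        for model in self.models:
            for _ in range(self._poisson1()):
                model.learn_one(x, y)

    def predict_one(self, x):
        # Majority vote over the base models
        votes = Counter(m.predict_one(x) for m in self.models)
        return votes.most_common(1)[0][0]

class OneNN:
    """Toy 1-nearest-neighbour base learner (a stable classifier)."""

    def __init__(self):
        self.memory = []

    def learn_one(self, x, y):
        self.memory.append((x, y))

    def predict_one(self, x):
        if not self.memory:
            return None
        nearest = min(self.memory,
                      key=lambda xy: sum((a - b) ** 2 for a, b in zip(xy[0], x)))
        return nearest[1]
```

Because Poisson(1) assigns probability ≈0.37 to k = 0, each base model skips roughly a third of the stream, which mimics the diversity of batch bootstrap replicates without storing the data.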

Citation

van Rijn, J. N., Holmes, G., Pfahringer, B., & Vanschoren, J. (2015). Case study on bagging stable classifiers for data streams. In Twenty-fourth Belgian-Dutch Conference on Machine Learning. Delft, Netherlands.
