
Case study on bagging stable classifiers for data streams

Abstract
Ensembles of classifiers are among the strongest classifiers in most data mining applications. Bagging ensembles exploit the instability of base classifiers by training them on different bootstrap replicates. It has been shown that bagging unstable classifiers, such as decision trees, generally yields good results, whereas bagging stable classifiers, such as k-NN, makes little difference. However, recent work suggests that this finding applies to the classical batch data mining setting rather than the data stream setting. We present an empirical study that supports this observation.
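To illustrate the bagging mechanism the abstract describes (training base classifiers on bootstrap replicates and aggregating their votes), here is a minimal, self-contained sketch. It is not the paper's experimental setup; the toy one-dimensional dataset, the `train_stump` base learner, and all function names are hypothetical choices for illustration only.

```python
import random

def bootstrap(data, rng):
    """Draw a bootstrap replicate: sample len(data) items with replacement."""
    return [rng.choice(data) for _ in data]

def train_stump(replicate, fallback):
    """Toy base learner (not from the paper): threshold halfway between
    the class means. Falls back to the full dataset if a class is absent
    from the replicate."""
    xs0 = [x for x, y in replicate if y == 0]
    xs1 = [x for x, y in replicate if y == 1]
    if not xs0 or not xs1:
        xs0 = [x for x, y in fallback if y == 0]
        xs1 = [x for x, y in fallback if y == 1]
    t = (sum(xs0) / len(xs0) + sum(xs1) / len(xs1)) / 2
    return lambda x: 1 if x > t else 0

def bagged_predict(x, models):
    """Aggregate base-model predictions by majority vote."""
    votes = [m(x) for m in models]
    return max(set(votes), key=votes.count)

# Hypothetical toy dataset: (feature, label), label 1 when feature > 0.
data = [(-3, 0), (-2, 0), (-1, 0), (1, 1), (2, 1), (3, 1)]

rng = random.Random(0)
models = [train_stump(bootstrap(data, rng), data) for _ in range(11)]

print(bagged_predict(-2, models))  # class 0
print(bagged_predict(2, models))   # class 1
```

A stump like this is a stable learner on separable data: every bootstrap replicate yields nearly the same threshold, so the ensemble votes almost unanimously and bagging adds little diversity, which is the batch-setting effect the abstract contrasts with the data stream setting.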
Type
Conference Contribution
Citation
van Rijn, J. N., Holmes, G., Pfahringer, B., & Vanschoren, J. (2015). Case study on bagging stable classifiers for data streams. In Twenty-fourth Belgian-Dutch Conference on Machine Learning. Delft, Netherlands.
Date
2015
Rights
©2015 copyright with the author.