dc.contributor.author: van Rijn, Jan N. [en_NZ]
dc.contributor.author: Holmes, Geoffrey [en_NZ]
dc.contributor.author: Pfahringer, Bernhard [en_NZ]
dc.contributor.author: Vanschoren, Joaquin [en_NZ]
dc.coverage.spatial: Delft, Netherlands [en_NZ]
dc.date.accessioned: 2015-11-20T02:11:12Z
dc.date.available: 2015 [en_NZ]
dc.date.available: 2015-11-20T02:11:12Z
dc.date.issued: 2015 [en_NZ]
dc.identifier.citation: van Rijn, J. N., Holmes, G., Pfahringer, B., & Vanschoren, J. (2015). Case study on bagging stable classifiers for data streams. In Twenty-fourth Belgian-Dutch Conference on Machine Learning. Delft, Netherlands. [en]
dc.identifier.uri: https://hdl.handle.net/10289/9765
dc.description.abstract: Ensembles of classifiers are among the strongest classifiers in most data mining applications. Bagging ensembles exploit the instability of base classifiers by training them on different bootstrap replicates. It has been shown that bagging unstable classifiers, such as decision trees, generally yields good results, whereas bagging stable classifiers, such as k-NN, makes little difference. However, recent work suggests that this finding applies to the classical batch data mining setting rather than the data stream setting. We present an empirical study that supports this observation.
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.rights: © 2015 copyright with the author.
dc.source: BENELEARN 2015 [en_NZ]
dc.subject: Machine learning
dc.title: Case study on bagging stable classifiers for data streams [en_NZ]
dc.type: Conference Contribution
dc.relation.isPartOf: Twenty-fourth Belgian-Dutch Conference on Machine Learning [en_NZ]
pubs.elements-id: 133356
pubs.finish-date: 2015-06-19 [en_NZ]
pubs.start-date: 2015-06-19 [en_NZ]
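
The abstract contrasts bagging an unstable base classifier (a decision tree) with bagging a stable one (k-NN), where each ensemble member is trained on a different bootstrap replicate. The sketch below is a minimal batch-setting illustration of that contrast, assuming scikit-learn; it is illustrative only and is not the authors' stream-based experimental setup.

# Minimal batch-setting sketch (assumes scikit-learn): compare a single
# classifier against a bagged ensemble of the same classifier, for an
# unstable base learner (decision tree) and a stable one (k-NN).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic classification data, used only for illustration.
X, y = make_classification(n_samples=2000, n_features=20, random_state=1)

for name, base in [("decision tree", DecisionTreeClassifier(random_state=1)),
                   ("k-NN", KNeighborsClassifier())]:
    single = cross_val_score(base, X, y, cv=5).mean()
    # Each ensemble member is fitted on a bootstrap replicate of the data.
    bagged = cross_val_score(
        BaggingClassifier(base, n_estimators=25, random_state=1), X, y, cv=5
    ).mean()
    print(f"{name}: single={single:.3f}  bagged={bagged:.3f}")

In this batch setting the bagged tree typically improves noticeably over a single tree, while the bagged k-NN stays close to a single k-NN, mirroring the behaviour the abstract attributes to the classical setting.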

