Abstract
An ensemble of learners tends to exceed the predictive performance of its individual members. This approach has been explored for both batch and online learning. Ensemble methods applied to data stream classification have been thoroughly investigated over the years, while their regression counterparts have received comparatively little attention. In this work, we discuss and analyze several techniques for generating, aggregating, and updating ensembles of regressors for evolving data streams. We investigate the impact of different strategies for inducing diversity into the ensemble by randomizing the input data (resampling, random subspaces, and random patches). On top of that, we devote particular attention to techniques that adapt the ensemble model in response to concept drifts, including adaptive window approaches, fixed periodical resets, and randomly determined windows. Extensive empirical experiments show that simple techniques can obtain predictive performance similar to sophisticated algorithms that rely on reactive adaptation (i.e., concept drift detection and recovery).
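As an illustration only, the following is a minimal Python sketch of the kind of input randomization the abstract refers to: online bagging via Poisson(1) resampling combined with per-member random feature subspaces (a toy "random patches" stream regressor). It is not the authors' implementation; the class and method names (RandomPatchesStreamRegressor, learn_one, predict_one) are invented for this example, and scikit-learn's SGDRegressor stands in for the incremental regressors typically used in stream learning.

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

class RandomPatchesStreamRegressor:
    """Toy online ensemble: Poisson(1) resampling (online bagging)
    plus a fixed random feature subspace per member ("random patches").
    Hypothetical sketch, not the method evaluated in the paper."""

    def __init__(self, n_features, n_members=10, subspace_size=0.6, seed=42):
        self.rng = np.random.default_rng(seed)
        k = max(1, int(subspace_size * n_features))
        # Each member sees its own randomly chosen subset of the features.
        self.subspaces = [self.rng.choice(n_features, size=k, replace=False)
                          for _ in range(n_members)]
        self.members = [SGDRegressor(learning_rate="constant", eta0=0.01)
                        for _ in range(n_members)]
        self.fitted = [False] * n_members

    def learn_one(self, x, y):
        x = np.asarray(x, dtype=float)
        for i, (model, subspace) in enumerate(zip(self.members, self.subspaces)):
            # Online bagging: weight the incoming example by a Poisson(1) draw.
            k = self.rng.poisson(1.0)
            if k == 0:
                continue
            xi = x[subspace].reshape(1, -1)
            for _ in range(k):
                model.partial_fit(xi, [y])
            self.fitted[i] = True

    def predict_one(self, x):
        x = np.asarray(x, dtype=float)
        preds = [m.predict(x[s].reshape(1, -1))[0]
                 for m, s, ok in zip(self.members, self.subspaces, self.fitted)
                 if ok]
        # Simple mean aggregation; return 0.0 before any member has been trained.
        return float(np.mean(preds)) if preds else 0.0

# Prequential (test-then-train) loop over a synthetic stream.
rng = np.random.default_rng(0)
reg = RandomPatchesStreamRegressor(n_features=5)
for _ in range(1000):
    x = rng.normal(size=5)
    y = 3.0 * x[0] - 2.0 * x[3] + rng.normal(scale=0.1)
    y_hat = reg.predict_one(x)   # test first ...
    reg.learn_one(x, y)          # ... then train
```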
Type
Conference Contribution
Citation
Gomes, H. M., Montiel, J., Mastelini, S. M., Pfahringer, B., & Bifet, A. (2020). On ensemble techniques for data stream regression. In Proceedings of 2020 International Joint Conference on Neural Networks (IJCNN). Washington, DC, USA: IEEE. https://doi.org/10.1109/IJCNN48605.2020.9206756
Date
2020
Publisher
IEEE
Rights
© 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.