dc.contributor.author | Gomes, Heitor Murilo | en_NZ |
dc.contributor.author | Montiel, Jacob | en_NZ |
dc.contributor.author | Mastelini, Saulo Martiello | en_NZ |
dc.contributor.author | Pfahringer, Bernhard | en_NZ |
dc.contributor.author | Bifet, Albert | en_NZ |
dc.coverage.spatial | ELECTR NETWORK | en_NZ |
dc.date.accessioned | 2021-04-14T21:20:00Z | |
dc.date.available | 2021-04-14T21:20:00Z | |
dc.date.issued | 2020 | en_NZ |
dc.identifier.citation | Gomes, H. M., Montiel, J., Mastelini, S. M., Pfahringer, B., & Bifet, A. (2020). On ensemble techniques for data stream regression. In Proceedings of 2020 International Joint Conference on Neural Networks (IJCNN). Washington, DC, USA: IEEE. https://doi.org/10.1109/IJCNN48605.2020.9206756 | en |
dc.identifier.issn | 2161-4393 | en_NZ |
dc.identifier.uri | https://hdl.handle.net/10289/14237 | |
dc.description.abstract | An ensemble of learners tends to exceed the predictive performance of individual learners. This approach has been explored for both batch and online learning. Ensemble methods applied to data stream classification have been thoroughly investigated over the years, while their regression counterparts have received less attention in comparison. In this work, we discuss and analyze several techniques for generating, aggregating, and updating ensembles of regressors for evolving data streams. We investigate the impact of different strategies for inducing diversity into the ensemble by randomizing the input data (resampling, random subspaces, and random patches). On top of that, we devote particular attention to techniques that adapt the ensemble model in response to concept drifts, including adaptive window approaches, fixed periodic resets, and randomly determined windows. Extensive empirical experiments show that simple techniques can obtain predictive performance similar to that of sophisticated algorithms that rely on reactive adaptation (i.e., concept drift detection and recovery). | |
dc.format.mimetype | application/pdf | |
dc.language.iso | en | |
dc.publisher | IEEE | en_NZ |
dc.rights | © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | |
dc.source | International Joint Conference on Neural Networks (IJCNN) held as part of the IEEE World Congress on Computational Intelligence (IEEE WCCI) | en_NZ |
dc.subject | Science & Technology | en_NZ |
dc.subject | Technology | en_NZ |
dc.subject | Computer Science, Artificial Intelligence | en_NZ |
dc.subject | Computer Science, Hardware & Architecture | en_NZ |
dc.subject | Computer Science | en_NZ |
dc.subject | data streams | en_NZ |
dc.subject | regression | en_NZ |
dc.subject | ensemble | en_NZ |
dc.subject | random patches | en_NZ |
dc.subject | random subspaces | en_NZ |
dc.title | On ensemble techniques for data stream regression | en_NZ |
dc.type | Conference Contribution | |
dc.identifier.doi | 10.1109/IJCNN48605.2020.9206756 | |
dc.relation.isPartOf | Proceedings of 2020 International Joint Conference on Neural Networks (IJCNN) | en_NZ |
pubs.elements-id | 258073 | |
pubs.finish-date | 2020-07-24 | en_NZ |
pubs.place-of-publication | Washington, DC, USA | |
pubs.publication-status | Published | en_NZ |
pubs.start-date | 2020-07-19 | en_NZ |