dc.contributor.author: Gomes, Heitor Murilo [en_NZ]
dc.contributor.author: Montiel, Jacob [en_NZ]
dc.contributor.author: Mastelini, Saulo Martiello [en_NZ]
dc.contributor.author: Pfahringer, Bernhard [en_NZ]
dc.contributor.author: Bifet, Albert [en_NZ]
dc.coverage.spatial: ELECTR NETWORK [en_NZ]
dc.date.accessioned: 2021-04-14T21:20:00Z
dc.date.available: 2021-04-14T21:20:00Z
dc.date.issued: 2020 [en_NZ]
dc.identifier.citation: Gomes, H. M., Montiel, J., Mastelini, S. M., Pfahringer, B., & Bifet, A. (2020). On ensemble techniques for data stream regression. In Proceedings of 2020 International Joint Conference on Neural Networks (IJCNN). Washington, DC, USA: IEEE. https://doi.org/10.1109/IJCNN48605.2020.9206756 [en]
dc.identifier.issn: 2161-4393 [en_NZ]
dc.identifier.uri: https://hdl.handle.net/10289/14237
dc.description.abstract: An ensemble of learners tends to exceed the predictive performance of its individual members. This approach has been explored for both batch and online learning. Ensemble methods applied to data stream classification have been thoroughly investigated over the years, while their regression counterparts have received comparatively less attention. In this work, we discuss and analyze several techniques for generating, aggregating, and updating ensembles of regressors for evolving data streams. We investigate the impact of different strategies for inducing diversity into the ensemble by randomizing the input data (resampling, random subspaces, and random patches). In addition, we devote particular attention to techniques that adapt the ensemble model in response to concept drift, including adaptive window approaches, fixed periodic resets, and randomly determined windows. Extensive empirical experiments show that simple techniques can obtain predictive performance similar to that of sophisticated algorithms that rely on reactive adaptation (i.e., concept drift detection and recovery). [en]
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.publisher: IEEE [en_NZ]
dc.rights: © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.source: International Joint Conference on Neural Networks (IJCNN) held as part of the IEEE World Congress on Computational Intelligence (IEEE WCCI) [en_NZ]
dc.subject: Science & Technology [en_NZ]
dc.subject: Technology [en_NZ]
dc.subject: Computer Science, Artificial Intelligence [en_NZ]
dc.subject: Computer Science, Hardware & Architecture [en_NZ]
dc.subject: Computer Science [en_NZ]
dc.subject: data streams [en_NZ]
dc.subject: regression [en_NZ]
dc.subject: ensemble [en_NZ]
dc.subject: random patches [en_NZ]
dc.subject: random subspaces [en_NZ]
dc.title: On ensemble techniques for data stream regression [en_NZ]
dc.type: Conference Contribution
dc.identifier.doi: 10.1109/IJCNN48605.2020.9206756
dc.relation.isPartOf: Proceedings of 2020 International Joint Conference on Neural Networks (IJCNN) [en_NZ]
pubs.elements-id: 258073
pubs.finish-date: 2020-07-24 [en_NZ]
pubs.place-of-publication: Washington, DC, USA
pubs.publication-status: Published [en_NZ]
pubs.start-date: 2020-07-19 [en_NZ]
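
The abstract above describes inducing ensemble diversity by randomizing the input data (resampling, random subspaces, random patches) and evaluating regressors prequentially on evolving streams. The Python code below is a minimal illustrative sketch of that general idea, not the authors' implementation: it assumes scikit-learn's SGDRegressor (trained incrementally via partial_fit) as the base learner, an invented RandomPatchesStreamRegressor class, Poisson-based online resampling, and a fixed random feature subset per member; the drift-adaptation mechanisms compared in the paper (adaptive windows, periodic or random resets) are omitted.

import numpy as np
from sklearn.linear_model import SGDRegressor

class RandomPatchesStreamRegressor:
    """Hypothetical sketch: online resampling + random feature subsets (patches)."""

    def __init__(self, n_members=10, subspace_frac=0.6, lam=6.0, seed=1):
        self.n_members = n_members
        self.subspace_frac = subspace_frac   # fraction of features seen by each member
        self.lam = lam                       # Poisson rate for online resampling
        self.rng = np.random.default_rng(seed)
        self.members = None                  # list of (feature_indices, regressor)

    def _init_members(self, n_features):
        k = max(1, int(self.subspace_frac * n_features))
        self.members = [
            (self.rng.choice(n_features, size=k, replace=False),
             SGDRegressor(learning_rate="constant", eta0=0.01))
            for _ in range(self.n_members)
        ]

    def predict_one(self, x):
        if not self.members:
            return 0.0
        # Average the predictions of members that have already been trained.
        preds = [reg.predict(x[feats].reshape(1, -1))[0]
                 for feats, reg in self.members if hasattr(reg, "coef_")]
        return float(np.mean(preds)) if preds else 0.0

    def learn_one(self, x, y):
        if self.members is None:
            self._init_members(len(x))
        for feats, reg in self.members:
            # Online resampling: each member sees this instance k ~ Poisson(lam) times.
            for _ in range(int(self.rng.poisson(self.lam))):
                reg.partial_fit(x[feats].reshape(1, -1), [y])

# Prequential (test-then-train) evaluation on a small synthetic stream.
stream_rng = np.random.default_rng(0)
model = RandomPatchesStreamRegressor()
abs_err = 0.0
n_instances = 2000
for _ in range(n_instances):
    x = stream_rng.normal(size=10)
    y = 3.0 * x[0] - 2.0 * x[3] + stream_rng.normal(scale=0.1)
    abs_err += abs(model.predict_one(x) - y)   # test first ...
    model.learn_one(x, y)                      # ... then train
print("prequential MAE:", abs_err / n_instances)

Member predictions are simply averaged; replacing the base learner with an incremental tree regressor or adding per-member reset logic (adaptive, periodic, or random windows) would bring the sketch closer to the techniques the paper compares.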

