
Field | Value | Language
dc.contributor.author | Montiel, Jacob | en_NZ
dc.contributor.author | Mitchell, Rory | en_NZ
dc.contributor.author | Frank, Eibe | en_NZ
dc.contributor.author | Pfahringer, Bernhard | en_NZ
dc.contributor.author | Abdessalem, Talel | en_NZ
dc.contributor.author | Bifet, Albert | en_NZ
dc.coverage.spatial | Glasgow, UK | en_NZ
dc.date.accessioned | 2020-12-06T22:51:55Z |
dc.date.available | 2020-12-06T22:51:55Z |
dc.date.issued | 2020 | en_NZ
dc.identifier.citation | Montiel, J., Mitchell, R., Frank, E., Pfahringer, B., Abdessalem, T., & Bifet, A. (2020). Adaptive XGBoost for evolving data streams. In Proceedings of 2020 International Joint Conference on Neural Networks (IJCNN) (pp. 1–8). Washington, DC, USA: IEEE. https://doi.org/10.1109/IJCNN48605.2020.9207555 | en
dc.identifier.isbn | 9781728169262 | en_NZ
dc.identifier.uri | https://hdl.handle.net/10289/14004 |
dc.description.abstract | Boosting is an ensemble method that combines base models in a sequential manner to achieve high predictive accuracy. A popular learning algorithm based on this ensemble method is eXtreme Gradient Boosting (XGB). We present an adaptation of XGB for classification of evolving data streams. In this setting, new data arrives over time and the relationship between the class and the features may change in the process, thus exhibiting concept drift. The proposed method creates new members of the ensemble from mini-batches of data as new data becomes available. The maximum ensemble size is fixed, but learning does not stop when this size is reached because the ensemble is updated on new data to ensure consistency with the current concept. We also explore the use of concept drift detection to trigger a mechanism to update the ensemble. We test our method on real and synthetic data with concept drift and compare it against batch-incremental and instance-incremental classification methods for data streams. | en_NZ
dc.format.mimetype | application/pdf |
dc.language.iso | en |
dc.publisher | IEEE | en_NZ
dc.rights | This is an author’s accepted version of an article published in the Proceedings of 2020 International Joint Conference on Neural Networks (IJCNN). © 2020 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE. |
dc.source | IJCNN 2020 | en_NZ
dc.subject | computer science | en_NZ
dc.subject | ensembles | en_NZ
dc.subject | boosting | en_NZ
dc.subject | stream learning | en_NZ
dc.subject | classification | en_NZ
dc.title | Adaptive XGBoost for evolving data streams | en_NZ
dc.type | Conference Contribution |
dc.identifier.doi | 10.1109/IJCNN48605.2020.9207555 | en_NZ
dc.relation.isPartOf | Proceedings of 2020 International Joint Conference on Neural Networks (IJCNN) | en_NZ
pubs.begin-page | 1 |
pubs.elements-id | 253774 |
pubs.end-page | 8 |
pubs.finish-date | 2020-07-24 | en_NZ
pubs.place-of-publication | Washington, DC, USA |
pubs.publication-status | Published | en_NZ
pubs.start-date | 2020-07-19 | en_NZ
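The abstract describes a batch-incremental scheme: new ensemble members are trained on mini-batches as data arrives, the ensemble size is capped, and a drift detector can trigger an update. The sketch below is a loose, toy illustration of that mechanism only, not the authors' implementation: a trivial threshold stump stands in for an XGBoost member, an accuracy-collapse check stands in for a real drift detector, and all class and parameter names (`StumpLearner`, `AdaptiveBatchEnsemble`, `drift_threshold`) are hypothetical.

```python
from collections import deque


class StumpLearner:
    """Toy stand-in for an XGBoost member: thresholds the first feature
    halfway between the per-class means seen in its training mini-batch."""

    def fit(self, X, y):
        ones = [x[0] for x, label in zip(X, y) if label == 1]
        zeros = [x[0] for x, label in zip(X, y) if label == 0]
        m1 = sum(ones) / len(ones) if ones else 0.0
        m0 = sum(zeros) / len(zeros) if zeros else 0.0
        self.threshold = (m1 + m0) / 2
        self.flip = m1 < m0  # invert if class 1 sits below class 0
        return self

    def predict(self, x):
        p = 1 if x[0] > self.threshold else 0
        return 1 - p if self.flip else p


class AdaptiveBatchEnsemble:
    """Fixed-capacity ensemble grown from mini-batches; majority vote."""

    def __init__(self, max_size=5, batch_size=20, drift_threshold=0.5):
        self.members = deque(maxlen=max_size)  # at capacity, oldest is replaced
        self.batch_size = batch_size
        self.drift_threshold = drift_threshold
        self._X, self._y = [], []
        self._hits, self._seen = 0, 0

    def predict(self, x):
        if not self.members:
            return 0
        votes = sum(m.predict(x) for m in self.members)
        return 1 if 2 * votes >= len(self.members) else 0

    def learn_one(self, x, y):
        # Prequential bookkeeping: test on the instance, then store it.
        self._seen += 1
        self._hits += int(self.predict(x) == y)
        self._X.append(x)
        self._y.append(y)
        # A full mini-batch trains one new member.
        if len(self._X) >= self.batch_size:
            self.members.append(StumpLearner().fit(self._X, self._y))
            self._X, self._y = [], []
        # Naive drift signal: sustained accuracy collapse resets the ensemble.
        if (self._seen >= 2 * self.batch_size
                and self._hits / self._seen < self.drift_threshold):
            self.members.clear()
            self._hits, self._seen = 0, 0
```

The `deque(maxlen=...)` makes the size cap implicit: appending a member beyond capacity evicts the oldest one, which is one simple way to keep the ensemble consistent with the most recent concept.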

