Title: Gradient boosted trees for evolving data streams
Authors: Gunasekara, Nuwan Amila; Pfahringer, Bernhard; Gomes, H.; Bifet, Albert
Date deposited: 2024-10-10
Year published: 2024
Citation: Gunasekara, N., Pfahringer, B., Gomes, H., & Bifet, A. (2024). Gradient boosted trees for evolving data streams. Machine Learning, 113, 3325-3352. https://doi.org/10.1007/s10994-024-06517-y
ISSN: 0885-6125 (print); 1573-0565 (online)
Handle: https://hdl.handle.net/10289/16980
DOI: 10.1007/s10994-024-06517-y
Type: Journal Article
Language: en
Rights: This article is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/)
Keywords: computer science; concept drift; gradient boosted trees; gradient boosting; machine learning; stream learning
Fields of Research: 46 Information and Computing Sciences; 4611 Machine Learning

Abstract:
Gradient boosting is a widely used machine learning technique that has proven highly effective in batch learning. However, its effectiveness in stream learning contexts lags behind that of bagging-based ensemble methods, which currently dominate the field. One reason for this discrepancy is the challenge of adapting the booster to a new concept following a concept drift. Resetting the entire booster can lead to significant performance degradation as it struggles to learn the new concept. Resetting only some parts of the booster can be more effective, but identifying which parts to reset is difficult, given that each boosting step builds on the previous prediction. To overcome these difficulties, we propose Streaming Gradient Boosted Trees (Sgbt), which is trained using the weighted squared loss elicited in XGBoost. Sgbt exploits trees with a replacement strategy to detect and recover from drifts, thus enabling the ensemble to adapt without sacrificing predictive performance. Our empirical evaluation of Sgbt on a range of streaming datasets with challenging drift scenarios demonstrates that it outperforms current state-of-the-art methods for evolving data streams.
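The abstract's core idea — an additive booster where each member carries its own drift detector and is individually replaced on drift, so the rest of the ensemble is preserved — can be illustrated with a minimal sketch. This is not the paper's Sgbt algorithm: `OnlineMean` is a trivial stand-in for an incremental tree learner, and `DriftStub` is a toy threshold detector standing in for a proper detector such as ADWIN; both names are invented for this illustration.

```python
class OnlineMean:
    """Trivial stand-in base learner: incremental mean of the targets it has seen."""
    def __init__(self):
        self.n, self.mean = 0, 0.0

    def predict(self, x):
        return self.mean

    def learn(self, x, y):
        self.n += 1
        self.mean += (y - self.mean) / self.n


class DriftStub:
    """Toy detector: flags drift when the recent mean error far exceeds the long-run mean."""
    def __init__(self, window=50, factor=3.0):
        self.window, self.factor = window, factor
        self.recent, self.total, self.n = [], 0.0, 0

    def add(self, err):
        self.n += 1
        self.total += err
        self.recent.append(err)
        if len(self.recent) > self.window:
            self.recent.pop(0)
        long_run = self.total / self.n
        recent = sum(self.recent) / len(self.recent)
        return self.n > self.window and recent > self.factor * max(long_run, 1e-9)


class StreamingBooster:
    """Additive booster: member i fits the residual (negative squared-loss gradient)
    left by members 0..i-1; on drift, only the drifted member is replaced."""
    def __init__(self, n_members=5, lr=0.5):
        self.lr = lr
        self.members = [OnlineMean() for _ in range(n_members)]
        self.detectors = [DriftStub() for _ in range(n_members)]

    def predict(self, x):
        return sum(self.lr * m.predict(x) for m in self.members)

    def learn(self, x, y):
        pred = 0.0
        for i in range(len(self.members)):
            residual = y - pred  # negative gradient of squared loss at the running prediction
            err = abs(residual - self.members[i].predict(x))
            if self.detectors[i].add(err):
                # Replace only the drifted member; the rest of the booster is kept.
                self.members[i] = OnlineMean()
                self.detectors[i] = DriftStub()
            self.members[i].learn(x, residual)
            pred += self.lr * self.members[i].predict(x)
```

On a stationary stream the sketch converges like an ordinary boosted additive model; after an abrupt target shift, each member's detector fires as its residual error spikes, and the per-member resets let the ensemble recover without discarding the whole booster at once.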