dc.contributor.author | Barddal, Jean Paul | en_NZ |
dc.contributor.author | Gomes, Heitor Murilo | en_NZ |
dc.contributor.author | Enembreck, Fabrício | en_NZ |
dc.contributor.author | Pfahringer, Bernhard | en_NZ |
dc.contributor.author | Bifet, Albert | en_NZ |
dc.contributor.editor | Frasconi, Paolo | en_NZ |
dc.contributor.editor | Landwehr, Niels | en_NZ |
dc.contributor.editor | Manco, Giuseppe | en_NZ |
dc.contributor.editor | Vreeken, Jilles | en_NZ |
dc.coverage.spatial | Riva del Garda, Italy | en_NZ |
dc.date.accessioned | 2017-05-04T03:51:49Z | |
dc.date.available | 2016 | en_NZ |
dc.date.available | 2017-05-04T03:51:49Z | |
dc.date.issued | 2016 | en_NZ |
dc.identifier.citation | Barddal, J. P., Gomes, H. M., Enembreck, F., Pfahringer, B., & Bifet, A. (2016). On dynamic feature weighting for feature drifting data streams. In P. Frasconi, N. Landwehr, G. Manco, & J. Vreeken (Eds.), Proceedings of European Conference on Machine Learning and Knowledge Discovery in Databases (Vol. LNAI 9852, pp. 129–144). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-46227-1_9 | en |
dc.identifier.isbn | 9783319462264 | en_NZ |
dc.identifier.issn | 0302-9743 | en_NZ |
dc.identifier.uri | https://hdl.handle.net/10289/11028 | |
dc.description.abstract | The ubiquity of data streams has been encouraging the development of new incremental and adaptive learning algorithms. Data stream learners must be fast and memory-bounded, but above all, tailored to adapt to possible changes in the data distribution, a phenomenon named concept drift. Recently, several works have shown the impact of a so far nearly neglected type of drift: feature drifts. Feature drifts occur whenever a subset of features becomes, or ceases to be, relevant to the learning task. In this paper we (i) provide insights into how the relevance of features can be tracked as a stream progresses using the information-theoretic Symmetrical Uncertainty measure; and (ii) show how this measure can be used to boost two learning schemes: Naive Bayes and k-Nearest Neighbor. Furthermore, we investigate the usage of these two new dynamically weighted learners as prediction models in the leaves of the Hoeffding Adaptive Tree classifier. Results show improvements in accuracy (an average of 10.69% for k-Nearest Neighbor, 6.23% for Naive Bayes and 4.42% for Hoeffding Adaptive Trees) on both synthetic and real-world datasets at the expense of a bounded increase in both memory consumption and processing time. | en_NZ |
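Note: the abstract weights features by their Symmetrical Uncertainty with respect to the class, SU(X, Y) = 2 * IG(X; Y) / (H(X) + H(Y)). A minimal batch sketch of that measure is given below for illustration only; the paper's own estimator is incremental and window-based, and the function names here are hypothetical, not the authors' API.

    import numpy as np
    from collections import Counter

    def entropy(values):
        # Shannon entropy (base 2) of a discrete sequence.
        counts = np.array(list(Counter(values).values()), dtype=float)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def symmetrical_uncertainty(x, y):
        # SU(X, Y) = 2 * IG(X; Y) / (H(X) + H(Y)), normalized to [0, 1].
        h_x, h_y = entropy(x), entropy(y)
        h_xy = entropy(list(zip(x, y)))   # joint entropy H(X, Y)
        ig = h_x + h_y - h_xy             # information gain (mutual information)
        denom = h_x + h_y
        return 0.0 if denom == 0 else 2.0 * ig / denom

    # A feature that mirrors the class gets weight ~1; an irrelevant one gets ~0.
    feature = [0, 0, 1, 1, 0, 1, 0, 1]
    label   = [0, 0, 1, 1, 0, 1, 0, 1]
    print(symmetrical_uncertainty(feature, label))  # -> 1.0

In a dynamically weighted Naive Bayes or k-Nearest Neighbor learner, such per-feature SU scores would scale each feature's contribution (e.g., its distance term), so features that drift out of relevance have their influence reduced as the stream progresses.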
dc.format.mimetype | application/pdf | |
dc.language.iso | en | |
dc.publisher | Springer | en_NZ |
dc.rights | © 2016 Springer International Publishing Switzerland. This is the author's accepted version. The final publication is available at Springer via dx.doi.org/10.1007/978-3-319-46227-1_9 | |
dc.source | ECML PKDD 2016 | en_NZ |
dc.subject | computer science | |
dc.subject | data stream mining | |
dc.subject | concept drift | |
dc.subject | feature drift | |
dc.subject | feature weighting | |
dc.subject | machine learning | |
dc.title | On dynamic feature weighting for feature drifting data streams | en_NZ |
dc.type | Conference Contribution | |
dc.identifier.doi | 10.1007/978-3-319-46227-1_9 | en_NZ |
dc.relation.isPartOf | Proceedings of European Conference on Machine Learning and Knowledge Discovery in Databases | en_NZ |
pubs.begin-page | 129 | |
pubs.elements-id | 142693 | |
pubs.end-page | 144 | |
pubs.finish-date | 2016-09-23 | en_NZ |
pubs.place-of-publication | Cham, Switzerland | |
pubs.start-date | 2016-09-19 | en_NZ |
pubs.volume | LNAI 9852 | en_NZ |
dc.identifier.eissn | 1611-3349 | en_NZ |