Alternating model trees
Files
Submitted version, 330.2Kb
Citation
Frank, E., Mayo, M., & Kramer, S. (2015). Alternating model trees. In Proc 30th ACM Symposium on Applied Computing, Data Mining Track. Salamanca, Spain: ACM Press.
Permanent Research Commons link: https://hdl.handle.net/10289/9398
Abstract
Model tree induction is a popular method for tackling regression problems requiring interpretable models. Model trees are decision trees with multiple linear regression models at the leaf nodes. In this paper, we propose a method for growing alternating model trees, a form of option tree for regression problems. The motivation is that alternating decision trees achieve high accuracy in classification problems because they represent an ensemble classifier as a single tree structure. As in alternating decision trees for classification, our alternating model trees for regression contain splitter and prediction nodes, but we use simple linear regression functions as opposed to constant predictors at the prediction nodes. Moreover, additive regression using forward stagewise modeling is applied to grow the tree rather than a boosting algorithm. The size of the tree is determined using cross-validation. Our empirical results show that alternating model trees achieve significantly lower squared error than standard model trees on several regression datasets.
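To illustrate the forward stagewise additive modeling mentioned in the abstract, the following is a minimal Python sketch in which each stage fits a simple (single-attribute) linear regression to the current residuals, in the spirit of the prediction nodes described above. It is not the authors' implementation: it omits the splitter nodes, the tree structure, and the cross-validated size selection, and the names SimpleLinearRegressor and fit_additive_model as well as the shrinkage value 0.5 are assumptions made for illustration only.

```python
# Illustrative sketch only, not the authors' method: forward stagewise
# additive regression with simple linear regression base models fitted
# to residuals. Names and the shrinkage value are hypothetical.
import numpy as np

class SimpleLinearRegressor:
    """One-attribute linear model a * x_j + b, choosing the attribute
    that minimises squared error on the residuals it is fitted to."""

    def fit(self, X, r):
        best_sse = np.inf
        for j in range(X.shape[1]):
            x = X[:, j]
            a, b = np.polyfit(x, r, 1)               # least-squares line
            sse = np.sum((r - (a * x + b)) ** 2)
            if sse < best_sse:
                best_sse, self.j, self.a, self.b = sse, j, a, b
        return self

    def predict(self, X):
        return self.a * X[:, self.j] + self.b


def fit_additive_model(X, y, n_stages=10, shrinkage=0.5):
    """Forward stagewise modeling: each stage fits a simple linear
    regressor to the current residuals and adds its shrunken output."""
    residual = y.astype(float).copy()
    stages = []
    for _ in range(n_stages):
        stage = SimpleLinearRegressor().fit(X, residual)
        residual -= shrinkage * stage.predict(X)
        stages.append(stage)
    return stages


def predict(stages, X, shrinkage=0.5):
    return shrinkage * sum(s.predict(X) for s in stages)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.1, size=200)
    model = fit_additive_model(X, y)
    print("training MSE:", np.mean((y - predict(model, X)) ** 2))
```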
Date
2015
Publisher
ACM Press
Rights
This is an author’s accepted version of an article in Proceedings of the 30th ACM Symposium on Applied Computing, Salamanca, Spain. © 2015 ACM.