Show simple item record  

dc.contributor.author: Holmes, Geoffrey
dc.contributor.author: Pfahringer, Bernhard
dc.contributor.author: Kirkby, Richard Brendon
dc.contributor.author: Frank, Eibe
dc.contributor.author: Hall, Mark A.
dc.coverage.spatial: Conference held at Helsinki, Finland
dc.identifier.citation: Holmes, G., Pfahringer, B., Kirkby, R., Frank, E. & Hall, M. (2002). Multiclass alternating decision trees. In T. Elomaa et al. (Eds.), Proceedings of the 13th European Conference on Machine Learning, Helsinki, Finland, August 19–23, 2002 (pp. 105-122). Berlin: Springer.
dc.description.abstract: The alternating decision tree (ADTree) is a successful classification technique that combines decision trees with the predictive accuracy of boosting into a set of interpretable classification rules. The original formulation of the tree induction algorithm restricted attention to binary classification problems. This paper empirically evaluates several wrapper methods for extending the algorithm to the multiclass case by splitting the problem into several two-class problems. Seeking a more natural solution, we then adapt the multiclass LogitBoost and AdaBoost.MH procedures to induce alternating decision trees directly. Experimental results confirm that these procedures are comparable in accuracy with wrapper methods based on the original ADTree formulation, while inducing much smaller trees.
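The "wrapper" idea mentioned in the abstract — splitting a K-class problem into several two-class problems, each handled by a binary learner — can be sketched as a one-vs-rest scheme. This is a minimal illustrative sketch, not the paper's implementation: the `Stump` class is a hypothetical stand-in for a binary ADTree, and all names and data here are invented for the example.

```python
class Stump:
    """Trivial binary learner on one numeric feature: picks the
    threshold/direction with the best training accuracy and scores
    examples by signed distance to the threshold (a crude margin).
    Stands in for a real two-class learner such as a binary ADTree."""

    def fit(self, xs, ys):
        best = None
        for t in set(xs):
            for sign in (1, -1):
                acc = sum((sign * (x - t) >= 0) == y for x, y in zip(xs, ys))
                if best is None or acc > best[0]:
                    best = (acc, t, sign)
        _, self.t, self.sign = best
        return self

    def margin(self, x):
        # Larger value => more confidently positive.
        return self.sign * (x - self.t)


def one_vs_rest_fit(xs, labels):
    # One binary problem per class: "this class" vs. "all the rest".
    classes = sorted(set(labels))
    return {c: Stump().fit(xs, [y == c for y in labels]) for c in classes}


def one_vs_rest_predict(models, x):
    # Predict the class whose binary model is most confident.
    return max(models, key=lambda c: models[c].margin(x))


# Toy 3-class dataset (hypothetical).
xs = [0.1, 0.2, 0.3, 1.1, 1.2, 2.1, 2.3]
ys = ['a', 'a', 'a', 'b', 'b', 'c', 'c']
models = one_vs_rest_fit(xs, ys)
print(one_vs_rest_predict(models, 0.15))  # -> 'a'
print(one_vs_rest_predict(models, 1.15))  # -> 'b'
```

Comparing raw margins across independently trained binary models is a known weakness of this scheme (the margins are not calibrated against each other), which is one motivation the abstract gives for adapting multiclass LogitBoost and AdaBoost.MH to induce a single tree directly.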
dc.publisher: Springer, Berlin
dc.source: ECML 2002
dc.subject: computer science
dc.subject: document profiling
dc.subject: text classification
dc.subject: Machine learning
dc.title: Multiclass alternating decision trees
dc.type: Conference Contribution
dc.relation.isPartOf: Proc 13th European Conference on Machine Learning
pubs.volume: LNCS 2430

Files in this item


There are no files associated with this item.
