Show simple item record  

dc.contributor.author        Cleary, John G.
dc.contributor.author        Trigg, Leonard E.
dc.date.accessioned          2008-10-20T01:39:00Z
dc.date.available            2008-10-20T01:39:00Z
dc.date.issued               1998-05
dc.identifier.citation       Cleary, J. G. & Trigg, L. E. (1998). Experiences with a weighted decision tree learner. (Working paper 98/10). Hamilton, New Zealand: University of Waikato, Department of Computer Science. [en_US]
dc.identifier.issn           1170-487X
dc.identifier.uri            https://hdl.handle.net/10289/1055
dc.description.abstract      Machine learning algorithms for inferring decision trees typically choose a single "best" tree to describe the training data. Recent research has shown that classification performance can be significantly improved by voting the predictions of multiple, independently produced decision trees. This paper describes an algorithm, OB1, that makes a weighted sum over many possible models. We describe one instance of OB1 that includes all possible decision trees as well as naïve Bayesian models. OB1 is compared with a number of other decision tree and instance-based learning algorithms on some of the data sets from the UCI repository. Both an information gain and an accuracy measure are used for the comparison. On the information gain measure, OB1 performs significantly better than all the other algorithms. On the accuracy measure, it is significantly better than all the algorithms except naïve Bayes, which performs comparably to OB1. [en_US]
dc.format.mimetype           application/pdf
dc.language.iso              en
dc.publisher                 University of Waikato, Department of Computer Science [en_US]
dc.relation.ispartofseries   Computer Science Working Papers
dc.subject                   computer science [en_US]
dc.subject                   option trees [en_US]
dc.subject                   bayesian statistics [en_US]
dc.subject                   decision trees [en_US]
dc.title                     Experiences with a weighted decision tree learner [en_US]
dc.type                      Working Paper [en_US]
uow.relation.series          98/10
pubs.elements-id             54761
pubs.place-of-publication    Hamilton [en_NZ]
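The abstract describes OB1 as making a weighted sum over many candidate models rather than committing to a single "best" decision tree. As a minimal sketch of that weighted-voting idea, assuming toy stand-in models and illustrative weights (the record does not give OB1's actual weighting scheme, which the working paper defines over all decision trees plus naïve Bayesian models):

```python
def weighted_vote(models, weights, x):
    """Combine per-class probability estimates from several models
    into one distribution via a normalised weighted sum."""
    total = sum(weights)
    combined = {}
    for model, w in zip(models, weights):
        for cls, p in model(x).items():
            combined[cls] = combined.get(cls, 0.0) + w * p / total
    return combined

# Two hypothetical "models", each returning a class-probability dict.
# These are illustrative placeholders, not the paper's learners.
model_a = lambda x: {"yes": 0.9, "no": 0.1}
model_b = lambda x: {"yes": 0.4, "no": 0.6}

dist = weighted_vote([model_a, model_b], weights=[3.0, 1.0], x=None)
print(dist)  # {'yes': 0.775, 'no': 0.225} — pulled toward model_a
```

The combined prediction is dominated by the more heavily weighted model but still tempered by the other, which is the intuition behind voting multiple trees instead of trusting one.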

