Theory combination: an alternative to data combination
dc.contributor.author | Ting, Kai Ming | |
dc.contributor.author | Low, Boon Toh | |
dc.date.accessioned | 2008-10-28T23:01:52Z | |
dc.date.available | 2008-10-28T23:01:52Z | |
dc.date.issued | 1996-10 | |
dc.description.abstract | The approach of combining theories learned from multiple batches of data provides an alternative to the common practice of learning one theory from all the available data (i.e., the data combination approach). This paper empirically examines the baseline behaviour of the theory combination approach in classification tasks. We find that theory combination can lead to better performance even if the disjoint batches of data are drawn randomly from a larger sample, and relate the relative performance of the two approaches to the learning curve of the classifier used. | en_US |
dc.format.mimetype | application/pdf | |
dc.identifier.citation | Ting, K. M. & Low, B. T. (1996). Theory combination: an alternative to data combination. (Working paper 96/19). Hamilton, New Zealand: University of Waikato, Department of Computer Science. | en_US |
dc.identifier.issn | 1170-487X | |
dc.identifier.uri | https://hdl.handle.net/10289/1172 | |
dc.language.iso | en | |
dc.relation.ispartofseries | Computer Science Working Papers | |
dc.subject | computer science | en_US |
dc.subject | theory combination | en_US |
dc.subject | data combination | en_US |
dc.subject | empirical evaluation | en_US |
dc.subject | learning curve | en_US |
dc.subject | near-asymptotic performance | en_US |
dc.title | Theory combination: an alternative to data combination | en_US |
dc.type | Working Paper | en_US |
uow.relation.series | 96/19 |
Files
Original bundle
- Name: uow-cs-wp-1996-19.pdf
- Size: 3.64 MB
- Format: Adobe Portable Document Format
License bundle
- Name: license.txt
- Size: 1.79 KB
- Format: Item-specific license agreed upon submission