Show simple item record  

dc.contributor.author: Jones, Steve (en_US)
dc.contributor.author: Paynter, Gordon W. (en_US)
dc.date.accessioned: 2008-03-19T04:58:07Z
dc.date.available: 2007-07-22 (en_US)
dc.date.available: 2008-03-19T04:58:07Z
dc.date.issued: 2001-02-01 (en_US)
dc.identifier.citation: Jones, S. & Paynter, G.W. (2001). Human evaluation of Kea, an automatic keyphrasing system. (Working paper series. University of Waikato, Department of Computer Science. No. 01/2/2001). Hamilton, New Zealand: University of Waikato. (en_US)
dc.identifier.uri: https://hdl.handle.net/10289/41
dc.description.abstract: This paper describes an evaluation of the Kea automatic keyphrase extraction algorithm. Tools that automatically identify keyphrases are desirable because document keyphrases have numerous applications in digital library systems, but are costly and time consuming to manually assign. Keyphrase extraction algorithms are usually evaluated by comparison to author-specified keywords, but this methodology has several well-known shortcomings. The results presented in this paper are based on subjective evaluations of the quality and appropriateness of keyphrases by human assessors, and make a number of contributions. First, they validate previous evaluations of Kea that rely on author keywords. Second, they show Kea's performance is comparable to that of similar systems that have been evaluated by human assessors. Finally, they justify the use of author keyphrases as a performance metric by showing that authors generally choose good keywords. (en_US)
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.publisher: University of Waikato, Department of Computer Science
dc.relation.ispartofseries: Computer Science Working Papers
dc.subject: keyphrase extraction (en_US)
dc.subject: author keyphrases (en_US)
dc.subject: digital libraries (en_US)
dc.title: Human evaluation of Kea, an automatic keyphrasing system. (en_US)
dc.type: Working Paper (en_US)
uow.relation.series: 01/2


