
      Human evaluation of Kea, an automatic keyphrasing system.

      Jones, Steve; Paynter, Gordon W.
Files: content.pdf (262.4 KB)
Citation:
      Jones, S. & Paynter, G.W. (2001). Human evaluation of Kea, an automatic keyphrasing system. (Working paper series. University of Waikato, Department of Computer Science. No. 01/2/2001). Hamilton, New Zealand: University of Waikato.
      Permanent Research Commons link: https://hdl.handle.net/10289/41
Abstract:
This paper describes an evaluation of the Kea automatic keyphrase extraction algorithm. Tools that automatically identify keyphrases are desirable because document keyphrases have numerous applications in digital library systems, but are costly and time-consuming to assign manually. Keyphrase extraction algorithms are usually evaluated by comparison to author-specified keywords, but this methodology has several well-known shortcomings. The results presented in this paper are based on subjective evaluations of the quality and appropriateness of keyphrases by human assessors, and make a number of contributions. First, they validate previous evaluations of Kea that rely on author keywords. Second, they show that Kea's performance is comparable to that of similar systems that have been evaluated by human assessors. Finally, they justify the use of author keyphrases as a performance metric by showing that authors generally choose good keyphrases.
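
The comparison-to-author-keywords evaluation the abstract mentions is straightforward to make concrete. The following minimal sketch (illustrative only, not the paper's or Kea's actual code) scores machine-extracted keyphrases against author-assigned ones by exact match after light normalization and reports precision and recall. The phrase lists and the normalize helper are assumptions made up for the example; published Kea evaluations typically also stem phrases before matching, which is omitted here for brevity.

def normalize(phrase: str) -> str:
    """Case-fold and collapse whitespace; stemming omitted for brevity."""
    return " ".join(phrase.lower().split())

def precision_recall(extracted: list[str], author: list[str]) -> tuple[float, float]:
    """Precision: fraction of extracted phrases the author also chose.
    Recall: fraction of author phrases the extractor found."""
    ext = {normalize(p) for p in extracted}
    gold = {normalize(p) for p in author}
    matches = len(ext & gold)
    return (matches / len(ext) if ext else 0.0,
            matches / len(gold) if gold else 0.0)

# Hypothetical example: two of three extracted phrases match the author's.
p, r = precision_recall(
    ["keyphrase extraction", "Digital Libraries", "machine learning"],
    ["keyphrase extraction", "digital libraries", "document summarization"],
)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.67 recall=0.67

The paper's point is that this metric only rewards phrases the author happened to pick; the human-assessor study reported above tests whether that proxy is trustworthy.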
Date: 2001-02-01
Type: Working Paper
Series: Computer Science Working Papers
Report No.: 01/2
Publisher: University of Waikato, Department of Computer Science
Collections: 2001 Working Papers