Research Commons

      Dimension-adaptive bounds on compressive FLD Classification

      Kabán, Ata; Durrant, Robert J.
      Files
      alt13_v3.pdf
Submitted version, 202.5 KB
      DOI
       10.1007/978-3-642-40935-6_21
      Citation
Kaban, A., & Durrant, R. J. (2013). Dimension-adaptive bounds on compressive FLD Classification. In S. Jain, R. Munos, F. Stephan, & T. Zeugmann (Eds.), Proceedings of the 24th International Conference on Algorithmic Learning Theory, Vol. LNAI 8139 (pp. 294–308). Berlin, Germany: Springer Berlin Heidelberg. http://doi.org/10.1007/978-3-642-40935-6_21
      Permanent Research Commons link: https://hdl.handle.net/10289/8939
      Abstract
Efficient dimensionality reduction by random projections (RP) is gaining popularity, hence the learning guarantees achievable in RP spaces are of great interest. In the finite-dimensional setting, it has been shown for the compressive Fisher Linear Discriminant (FLD) classifier that for good generalisation the required target dimension grows only as the log of the number of classes and is not adversely affected by the number of projected data points. However, these bounds depend on the dimensionality d of the original data space. In this paper we give further guarantees that remove d from the bounds under certain conditions of regularity on the data density structure. In particular, if the data density does not fill the ambient space then the error of compressive FLD is independent of the ambient dimension and depends only on a notion of ‘intrinsic dimension’.
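The setting the abstract describes, projecting the data with a random matrix and then fitting a Fisher Linear Discriminant in the compressed space, can be sketched as follows. This is a minimal illustration only: the toy Gaussian data, the dimensions d and k, and all variable names are assumptions for the sketch, not the paper's analysis or experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class Gaussian data in a high ambient dimension d (illustrative only)
d, n, k = 100, 200, 10                      # ambient dim, points per class, target dim
X0 = rng.normal(0.0, 1.0, (n, d))           # class 0
X1 = rng.normal(1.0, 1.0, (n, d))           # class 1, mean shifted by 1 per coordinate

# Random projection: i.i.d. Gaussian entries scaled so squared norms
# are preserved in expectation
R = rng.normal(0.0, 1.0 / np.sqrt(k), (d, k))
Y0, Y1 = X0 @ R, X1 @ R                     # data in the compressed k-dim space

# Fisher Linear Discriminant in the compressed space:
# direction w = Sigma^{-1}(mu1 - mu0), threshold at the midpoint of the means
mu0, mu1 = Y0.mean(axis=0), Y1.mean(axis=0)
Sigma = 0.5 * (np.cov(Y0, rowvar=False) + np.cov(Y1, rowvar=False))
w = np.linalg.solve(Sigma, mu1 - mu0)
b = -0.5 * w @ (mu0 + mu1)

def predict(Y):
    """Return 1 for class 1, 0 for class 0."""
    return (Y @ w + b > 0).astype(int)

# Training accuracy of the compressive FLD on the toy data
acc = np.mean(np.concatenate([predict(Y0) == 0, predict(Y1) == 1]))
```

Because the toy classes are well separated in the ambient space, the classifier remains accurate after projection to k = 10 dimensions, which is the kind of behaviour the paper's bounds quantify.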
      Date
      2013
      Type
      Conference Contribution
      Publisher
      Springer Berlin Heidelberg
      Rights
This is an author’s accepted version of a paper published in Algorithmic Learning Theory: 24th International Conference, ALT 2013, Singapore, October 6–9, 2013, Proceedings. © Springer-Verlag Berlin Heidelberg 2013.
      Collections
      • Computing and Mathematical Sciences Papers [1452]

      Usage

Downloads, last 12 months: 92
       
       
       
