
Dimension-adaptive bounds on compressive FLD Classification

Abstract
Efficient dimensionality reduction by random projections (RP) is gaining popularity, hence the learning guarantees achievable in RP spaces are of great interest. In the finite-dimensional setting, it has been shown for the compressive Fisher Linear Discriminant (FLD) classifier that, for good generalisation, the required target dimension grows only as the log of the number of classes and is not adversely affected by the number of projected data points. However, these bounds depend on the dimensionality d of the original data space. In this paper we give further guarantees that remove d from the bounds under certain regularity conditions on the data density structure. In particular, if the data density does not fill the ambient space then the error of compressive FLD is independent of the ambient dimension and depends only on a notion of ‘intrinsic dimension'.
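The scheme analysed in the abstract can be illustrated in code: project the data to a low-dimensional space with a random Gaussian matrix, then fit an ordinary FLD classifier in the projected space. The sketch below is an illustrative implementation under generic assumptions (Gaussian projection entries, pooled within-class covariance, a small hypothetical ridge term for numerical stability); function names and the toy data are ours, not the paper's.

```python
import numpy as np

def compressive_fld_fit(X, y, k, rng):
    """Project d-dimensional data to k dimensions with a random Gaussian
    matrix, then fit Fisher Linear Discriminant in the projected space.
    (Illustrative sketch; not the paper's exact construction.)"""
    d = X.shape[1]
    R = rng.standard_normal((k, d)) / np.sqrt(k)  # random projection matrix
    Z = X @ R.T                                   # compressed data, shape (n, k)
    classes = np.unique(y)
    means = np.array([Z[y == c].mean(axis=0) for c in classes])
    # pooled within-class scatter, normalised to a covariance estimate
    S = sum(np.cov(Z[y == c], rowvar=False) * (np.sum(y == c) - 1)
            for c in classes) / (len(y) - len(classes))
    S += 1e-6 * np.eye(k)  # hypothetical small ridge for invertibility
    return R, classes, means, np.linalg.inv(S)

def compressive_fld_predict(X, R, classes, means, Sinv):
    """Classify by smallest Mahalanobis distance to a class mean
    in the projected space (equal-prior FLD decision rule)."""
    Z = X @ R.T
    d2 = np.array([np.einsum('ij,jk,ik->i', Z - m, Sinv, Z - m)
                   for m in means])
    return classes[np.argmin(d2, axis=0)]

# toy demo: two well-separated Gaussian classes in d=100, compressed to k=10
rng = np.random.default_rng(0)
X = np.vstack([rng.standard_normal((200, 100)),
               rng.standard_normal((200, 100)) + 3.0])
y = np.array([0] * 200 + [1] * 200)
model = compressive_fld_fit(X, y, k=10, rng=rng)
acc = np.mean(compressive_fld_predict(X, *model) == y)
```

Note that the target dimension k=10 here is far below the ambient d=100; the paper's bounds concern how small k can be while preserving classification accuracy.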
Type
Conference Contribution
Citation
Kaban, A., & Durrant, R. J. (2013). Dimension-adaptive bounds on compressive FLD Classification. In S. Jain, R. Munos, F. Stephan, & T. Zeugmann (Eds.), Proceedings of 24th International Conference on Algorithmic Learning Theory, Vol. LNAI 8139 (pp. 294–308). Berlin, Germany: Springer Berlin Heidelberg. http://doi.org/10.1007/978-3-642-40935-6_21
Date
2013
Publisher
Springer Berlin Heidelberg
Rights
This is an author’s accepted version of a paper published in Algorithmic Learning Theory; 24th International Conference, ALT 2013, Singapore, October 6-9, 2013. Proceedings. © Springer-Verlag Berlin Heidelberg 2013.