Research Commons

      Effective Linear-Time Feature Selection

      Pradhananga, Nripendra
      Files
      thesis.pdf (1.821 MB)
      Citation
      Pradhananga, N. (2007). Effective Linear-Time Feature Selection (Thesis, Master of Science (MSc)). The University of Waikato, Hamilton, New Zealand. Retrieved from https://hdl.handle.net/10289/2315
      Permanent Research Commons link: https://hdl.handle.net/10289/2315
      Abstract
      The classification learning task requires selecting a subset of features to represent the patterns to be classified. This is because the performance of the classifier and the cost of classification are sensitive to the choice of features used to construct the classifier. Exhaustive search is impractical since it examines every possible combination of features. Heuristic and random searches have better runtimes, but the problem still persists when dealing with high-dimensional datasets.

      We investigate a heuristic, forward, wrapper-based approach called Linear Sequential Selection, which limits the search space at each iteration of the feature selection process. We then introduce randomization into the search space; the resulting algorithm is called Randomized Linear Sequential Selection. Our experiments demonstrate that both methods are faster, find smaller feature subsets, and can even increase classification accuracy.
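
      For concreteness, the sketch below is a minimal illustration of the idea, not the thesis's actual algorithm: "limiting the search space" is assumed here to mean evaluating only a fixed-size window of candidate features per iteration, with the cross-validated accuracy of the wrapped classifier as the score. The function name, the window parameter, the stopping rule, and the use of scikit-learn are all illustrative assumptions.

```python
import random

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier


def linear_sequential_selection(X, y, estimator, window=5, randomize=False, seed=1):
    """Forward wrapper-based selection that examines only a small window of
    candidate features per iteration, so each feature is evaluated at most
    once (a linear number of wrapper evaluations). Hypothetical sketch."""
    rng = random.Random(seed)
    remaining = list(range(X.shape[1]))
    if randomize:
        rng.shuffle(remaining)          # randomized variant: windows drawn in random order
    selected, best_score = [], 0.0
    while remaining:
        candidates, remaining = remaining[:window], remaining[window:]
        # Wrapper step: score each candidate by the cross-validated accuracy
        # of the classifier trained on the current subset plus that feature.
        scored = [(cross_val_score(estimator, X[:, selected + [f]], y, cv=5).mean(), f)
                  for f in candidates]
        score, best = max(scored)
        if score <= best_score:         # no candidate improved the subset: stop
            break
        selected.append(best)
        best_score = score
    return selected, best_score


X, y = load_breast_cancer(return_X_y=True)
subset, acc = linear_sequential_selection(
    X, y, DecisionTreeClassifier(random_state=1), window=5, randomize=True)
print(subset, round(acc, 3))
```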

      We also explore the idea of ensemble learning. We propose two ensemble creation methods, Feature Selection Ensemble and Random Feature Ensemble. Both methods apply a feature selection algorithm to create the individual classifiers of the ensemble. Our experiments show that both methods work well with high-dimensional data.
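
      The abstract does not describe how these ensembles are constructed. The sketch below assumes each member is a classifier trained on its own feature subset, with members combined by majority vote; plugging in a wrapper-based selector gives something in the spirit of a Feature Selection Ensemble, while the random-half selector shown mimics a Random Feature Ensemble. All names and parameters are hypothetical, not taken from the thesis.

```python
import random

import numpy as np
from sklearn.base import clone
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier


def feature_subset_ensemble(X, y, base_estimator, select_features, n_members=10, seed=1):
    """Train one classifier per member on its own feature subset, then combine
    members by majority vote (assumes integer class labels). Hypothetical sketch."""
    rng = random.Random(seed)
    members = []
    for _ in range(n_members):
        feats = select_features(X, y, rng)                 # per-member feature subset
        clf = clone(base_estimator).fit(X[:, feats], y)
        members.append((feats, clf))

    def predict(X_new):
        votes = np.stack([clf.predict(X_new[:, feats]).astype(int)
                          for feats, clf in members])
        # Majority vote over members, one column of votes per instance.
        return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

    return predict


# Random Feature Ensemble flavour: each member sees a random half of the features.
random_half = lambda X, y, rng: rng.sample(range(X.shape[1]), X.shape[1] // 2)

X, y = load_breast_cancer(return_X_y=True)
predict = feature_subset_ensemble(X, y, DecisionTreeClassifier(random_state=1), random_half)
print((predict(X) == y).mean())          # training-set accuracy, for illustration only
```
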
      Date
      2007
      Type
      Thesis
      Degree Name
      Master of Science (MSc)
      Publisher
      The University of Waikato
      Rights
      All items in Research Commons are provided for private study and research purposes and are protected by copyright with all rights reserved unless otherwise indicated.
      Collections
      • Masters Degree Theses [2409]