
      Bagging ensemble selection for regression

      Sun, Quan; Pfahringer, Bernhard
      Files
      ai12_quan_bagging_ensemble_selection_for_regression.pdf
229.8 KB
      DOI
       10.1007/978-3-642-35101-3_59
      Citation
Sun, Q. & Pfahringer, B. (2012). Bagging ensemble selection for regression. In AI 2012: Advances in Artificial Intelligence, Lecture Notes in Computer Science, vol. 7691 (pp. 695-706). Springer.
      Permanent Research Commons link: https://hdl.handle.net/10289/6940
      Abstract
Bagging ensemble selection (BES) is a relatively new ensemble learning strategy. The strategy can be seen as an ensemble of the ensemble selection from libraries of models (ES) strategy. Previous experimental results on binary classification problems have shown that, using random trees as base classifiers, BES-OOB (the most successful variant of BES) is competitive with, and in many cases superior to, other ensemble learning strategies such as the original ES algorithm, stacking with linear regression, random forests and boosting. Motivated by the promising results in classification, this paper examines the predictive performance of the BES-OOB strategy for regression problems. Our results show that the BES-OOB strategy outperforms Stochastic Gradient Boosting and Bagging when using regression trees as the base learners. Our results also suggest that the advantage of using a diverse model library becomes clear when the model library size is relatively large. We also present encouraging results indicating that the non-negative least squares algorithm is a viable approach for pruning an ensemble of ensembles.
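
To make the strategy concrete, here is a minimal, hedged sketch (in Python, using numpy, scikit-learn and scipy) of the two ideas the abstract describes: bagging a greedy ensemble selection procedure with out-of-bag (OOB) evaluation, and pruning the combined model library with non-negative least squares. All names and parameters (greedy_selection, the library size, the number of bags) are illustrative assumptions, not the authors' code or exact algorithm.

      import numpy as np
      from scipy.optimize import nnls
      from sklearn.datasets import make_regression
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(0)
      X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

      # Build a library of base regressors, each trained on its own bootstrap
      # sample; the out-of-bag (OOB) rows are remembered for later evaluation.
      n = len(X_tr)
      library, oob_rows = [], []
      for seed in range(50):
          idx = rng.integers(0, n, n)            # bootstrap indices
          oob = np.setdiff1d(np.arange(n), idx)  # rows not in the bootstrap
          model = DecisionTreeRegressor(random_state=seed).fit(X_tr[idx], y_tr[idx])
          library.append(model)
          oob_rows.append(oob)

      # Cache every model's predictions on the training set (rows = models).
      P = np.array([m.predict(X_tr) for m in library])

      def greedy_selection(candidate_ids, n_steps=20):
          # Greedy forward ensemble selection with replacement, scoring each
          # candidate by squared error on its own OOB rows (a simplification
          # of the paper's OOB evaluation).
          chosen = []
          for _ in range(n_steps):
              best_id, best_err = None, np.inf
              for m in candidate_ids:
                  oob = oob_rows[m]
                  pred = P[chosen + [m]][:, oob].mean(axis=0)
                  err = np.mean((pred - y_tr[oob]) ** 2)
                  if err < best_err:
                      best_id, best_err = m, err
              chosen.append(best_id)
          return chosen

      # Bag the selection itself: each bag runs selection over a random half
      # of the library, and the final prediction averages the sub-ensembles.
      all_ids = np.arange(len(library))
      bags = [greedy_selection(rng.choice(all_ids, size=len(library) // 2,
                                          replace=False))
              for _ in range(10)]
      bes_pred = np.mean([np.mean([library[m].predict(X_te) for m in bag], axis=0)
                          for bag in bags], axis=0)
      print("BES-OOB test RMSE:", np.sqrt(np.mean((bes_pred - y_te) ** 2)))

      # Pruning with non-negative least squares: fit non-negative combination
      # weights over the cached predictions and drop models whose weight is
      # (near) zero.
      w, _ = nnls(P.T, y_tr)
      kept = np.flatnonzero(w > 1e-8)
      pruned = np.stack([library[m].predict(X_te) for m in kept])
      print(f"NNLS kept {len(kept)} of {len(library)} models")
      print("NNLS test RMSE:", np.sqrt(np.mean((w[kept] @ pruned - y_te) ** 2)))

In this sketch, giving each bag only a random half of the library is one way to read the abstract's point about diverse model libraries, and the NNLS weights give a direct criterion for dropping models that contribute nothing to the combined fit.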
      Date
      2012
      Type
      Conference Contribution
      Publisher
      Springer
      Rights
      This is the author's accepted version. The original publication is available at www.springerlink.com. Copyright Springer-Verlag Berlin Heidelberg 2012.
      Collections
      • Computing and Mathematical Sciences Papers [1454]

      Usage

Downloads, last 12 months: 98
       
       
       
