Pruning feature extractor stacking for cross-domain few-shot learning

Abstract

Combining knowledge from source domains to learn efficiently from a few labelled instances in a target domain is a transfer learning problem known as cross-domain few-shot learning (CDFSL). Feature extractor stacking (FES) is a state-of-the-art CDFSL method that maintains a collection of source domain feature extractors instead of a single universal extractor. FES uses stacked generalisation to build an ensemble from extractor snapshots saved during target domain fine-tuning. It outperforms several contemporary universal model-based CDFSL methods in the Meta-Dataset benchmark. However, it incurs higher storage cost because it saves a snapshot for every fine-tuning iteration for every extractor. In this work, we propose a bidirectional snapshot selection strategy for FES, leveraging its cross-validation process and the ordered nature of its snapshots, and demonstrate that a 95% snapshot reduction can be achieved while retaining the same level of accuracy.
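The abstract's core idea of pruning an ordered collection of snapshots from both directions can be illustrated with a small sketch. The function below is a hypothetical illustration, not the authors' implementation: it assumes each snapshot (ordered by fine-tuning iteration) has a cross-validation score, and it grows a contiguous window outward from the best-scoring snapshot, extending towards whichever neighbour scores higher, until a snapshot budget is met.

```python
def select_snapshots(cv_scores, budget):
    """Hypothetical bidirectional selection over iteration-ordered snapshots.

    cv_scores: cross-validation accuracy per snapshot, in fine-tuning order.
    budget: maximum number of snapshots to keep (e.g. 5% of the total).
    Returns the indices of the retained contiguous window of snapshots.
    """
    # Start from the snapshot with the best cross-validation score.
    best = max(range(len(cv_scores)), key=cv_scores.__getitem__)
    lo = hi = best
    # Expand the window in both directions, preferring the better neighbour,
    # until the budget is exhausted or all snapshots are included.
    while hi - lo + 1 < budget and (lo > 0 or hi < len(cv_scores) - 1):
        left = cv_scores[lo - 1] if lo > 0 else float("-inf")
        right = cv_scores[hi + 1] if hi < len(cv_scores) - 1 else float("-inf")
        if left >= right:
            lo -= 1
        else:
            hi += 1
    return list(range(lo, hi + 1))


# Example: 10 snapshots, keep a budget of 3 around the accuracy peak.
scores = [0.50, 0.55, 0.62, 0.70, 0.74, 0.73, 0.68, 0.60, 0.58, 0.57]
print(select_snapshots(scores, budget=3))  # → [3, 4, 5]
```

The contiguous-window assumption is one plausible way to exploit the ordered nature of the snapshots mentioned in the abstract; the paper's actual strategy may select snapshots differently.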

Citation

Wang, H., Frank, E., Pfahringer, B., & Holmes, G. (2025). Pruning feature extractor stacking for cross-domain few-shot learning. Transactions on Machine Learning Research.
