Publication

Self-trained centroid classifiers for semi-supervised cross-domain few-shot learning

Abstract
State-of-the-art cross-domain few-shot learning methods for image classification apply knowledge transfer by fine-tuning deep feature extractors obtained from source domains on the small labelled dataset available for the target domain, generally in conjunction with a simple centroid-based classification head. Semi-supervised learning during the meta-test phase is an obvious way to incorporate unlabelled data into cross-domain few-shot learning, but semi-supervised methods designed for larger labelled sets than those available in few-shot learning appear to go astray easily when applied in this setting. We propose an efficient semi-supervised learning method that applies self-training to the classification head only, and show that it yields very consistent improvements in average performance on the Meta-Dataset benchmark for cross-domain few-shot learning when applied with contemporary methods that use centroid-based classification.
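To illustrate the general idea described in the abstract, the following is a minimal sketch of self-training applied to a centroid-based classification head over a frozen feature extractor. It is not the paper's exact procedure; the function names, the hard pseudo-labelling step, and hyper-parameters such as `num_iterations` are illustrative assumptions.

```python
# Sketch: self-training of a centroid (prototype) classifier on pre-extracted
# embeddings. Only the classification head (the centroids) is updated; the
# feature extractor is assumed fixed. All names and settings are illustrative.
import numpy as np

def centroids_from_labels(features, labels, num_classes):
    """Class centroids = mean embedding of the (pseudo-)labelled examples per class."""
    return np.stack([features[labels == c].mean(axis=0) for c in range(num_classes)])

def predict_probs(features, centroids):
    """Soft class assignments from negative squared Euclidean distance to each centroid."""
    dists = ((features[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    logits = -dists
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)

def self_train_centroids(support_x, support_y, unlabelled_x, num_classes, num_iterations=5):
    """Iteratively pseudo-label the unlabelled examples and refit the centroids
    on the union of labelled and pseudo-labelled data."""
    centroids = centroids_from_labels(support_x, support_y, num_classes)
    for _ in range(num_iterations):
        pseudo_y = predict_probs(unlabelled_x, centroids).argmax(axis=1)
        all_x = np.concatenate([support_x, unlabelled_x])
        all_y = np.concatenate([support_y, pseudo_y])
        centroids = centroids_from_labels(all_x, all_y, num_classes)
    return centroids
```

Because every class is represented in the support set, each centroid is always defined even if some class receives no pseudo-labels in a given iteration.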
Type
Conference Contribution
Citation
Date
2023
Publisher
PMLR
Rights
This is the author’s accepted version of a conference paper published in Proc. 2nd Conference on Lifelong Learning Agents (CoLLAs 2023), PMLR 232. © 2023 the authors and PMLR.