Semi-Supervised Learning under Class Distribution Mismatch

Authors

  • Yanbei Chen, Queen Mary University of London
  • Xiatian Zhu, Queen Mary University of London
  • Wei Li, Queen Mary University of London
  • Shaogang Gong, Queen Mary University of London

DOI:

https://doi.org/10.1609/aaai.v34i04.5763

Abstract

Semi-supervised learning (SSL) aims to avoid the need for collecting prohibitively expensive labelled training data. Whilst demonstrating impressive performance boosts, existing SSL methods artificially assume that small labelled data and large unlabelled data are drawn from the same class distribution. In the more realistic scenario of class distribution mismatch between the two sets, they often suffer severe performance degradation due to error propagation introduced by irrelevant unlabelled samples. Our work addresses this under-studied and realistic SSL problem with a novel algorithm named Uncertainty-Aware Self-Distillation (UASD). Specifically, UASD produces soft targets that avoid catastrophic error propagation and empowers effective learning from unconstrained unlabelled data containing out-of-distribution (OOD) samples. This is achieved by joint self-distillation and OOD filtering in a unified formulation. Without bells and whistles, UASD significantly outperforms six state-of-the-art methods in more realistic SSL under class distribution mismatch on three popular image classification datasets: CIFAR10, CIFAR100, and TinyImageNet.
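The abstract's core idea — soft targets built from the model's own past predictions, combined with confidence-based filtering of out-of-distribution unlabelled samples — can be sketched roughly as follows. This is a minimal illustration, not the paper's exact formulation: the function names, the confidence threshold, and the plain averaging of past predictions are all illustrative assumptions.

```python
import numpy as np

def uasd_targets(pred_history, threshold=0.6):
    """Sketch of uncertainty-aware soft-target construction.

    pred_history: array of shape (epochs, n_samples, n_classes) holding
    softmax outputs for the unlabelled set from past training epochs.
    Averaging them yields ensembled soft targets; samples whose target
    confidence is low are treated as likely OOD and masked out.
    The 0.6 threshold is an illustrative assumption.
    """
    soft_targets = pred_history.mean(axis=0)   # ensemble of past predictions
    confidence = soft_targets.max(axis=1)      # peak class probability per sample
    keep = confidence >= threshold             # low confidence -> filtered as OOD
    return soft_targets, keep

def distillation_loss(student_probs, soft_targets, keep, eps=1e-8):
    """Cross-entropy between soft targets and current student predictions,
    computed only on retained (in-distribution) unlabelled samples."""
    if not keep.any():
        return 0.0
    ce = -(soft_targets[keep] * np.log(student_probs[keep] + eps)).sum(axis=1)
    return float(ce.mean())
```

Under this sketch, an unlabelled sample whose averaged prediction stays near uniform (low peak probability) contributes nothing to the unsupervised loss, which is the mechanism the abstract credits with avoiding error propagation from irrelevant samples.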

Published

2020-04-03

How to Cite

Chen, Y., Zhu, X., Li, W., & Gong, S. (2020). Semi-Supervised Learning under Class Distribution Mismatch. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 3569-3576. https://doi.org/10.1609/aaai.v34i04.5763

Section

AAAI Technical Track: Machine Learning