EWGAN: Entropy-Based Wasserstein GAN for Imbalanced Learning

Authors

  • Jinfu Ren, Hong Kong Baptist University
  • Yang Liu, Hong Kong Baptist University
  • Jiming Liu, Hong Kong Baptist University

DOI:

https://doi.org/10.1609/aaai.v33i01.330110011

Abstract

In this paper, we propose a novel oversampling strategy dubbed Entropy-based Wasserstein Generative Adversarial Network (EWGAN) to generate data samples for minority classes in imbalanced learning. First, we construct an entropy-weighted label vector for each class to characterize the data imbalance across classes. Then we concatenate this entropy-weighted label vector with the original feature vector of each data sample, and feed it into the WGAN model to train the generator. After the generator is trained, we concatenate the entropy-weighted label vector with random noise feature vectors, and feed them into the generator to generate data samples for minority classes. Experimental results on two benchmark datasets show that the samples generated by the proposed oversampling strategy help to improve the classification performance when the data are highly imbalanced. Furthermore, the proposed strategy outperforms other state-of-the-art oversampling algorithms in terms of classification accuracy.
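The conditioning step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the exact entropy-weighting formula is an assumption (here, each class's contribution to the dataset entropy, normalized to sum to one), and the function names are hypothetical.

```python
import numpy as np

def entropy_weights(class_counts):
    # Hypothetical weighting: each class's term -p*log(p) in the
    # dataset entropy, normalized to sum to 1. The paper's exact
    # formula may differ.
    p = np.asarray(class_counts, dtype=float)
    p /= p.sum()
    h = -p * np.log(p)
    return h / h.sum()

def conditioned_noise(class_idx, weights, noise_dim, rng):
    # Entropy-weighted label vector concatenated with a random noise
    # vector, as fed to the trained WGAN generator at sampling time.
    label = np.zeros(len(weights))
    label[class_idx] = weights[class_idx]
    z = rng.standard_normal(noise_dim)
    return np.concatenate([label, z])

rng = np.random.default_rng(0)
w = entropy_weights([900, 80, 20])            # highly imbalanced 3-class data
x = conditioned_noise(2, w, noise_dim=8, rng=rng)  # condition on minority class 2
print(x.shape)
```

During training, the same entropy-weighted label vector is instead concatenated with each real sample's feature vector, so the generator learns a class-conditional distribution.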

Published

2019-07-17

How to Cite

Ren, J., Liu, Y., & Liu, J. (2019). EWGAN: Entropy-Based Wasserstein GAN for Imbalanced Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 10011-10012. https://doi.org/10.1609/aaai.v33i01.330110011

Section

Student Abstract Track