Light Multi-Segment Activation for Model Compression

Authors

  • Zhenhui Xu, Peking University
  • Guolin Ke, Microsoft Research
  • Jia Zhang, Microsoft Research
  • Jiang Bian, Microsoft Research
  • Tie-Yan Liu, Microsoft Research

DOI:

https://doi.org/10.1609/aaai.v34i04.6128

Abstract

Model compression has become necessary when applying neural networks (NN) to many real-world tasks that can accept slightly reduced model accuracy but impose strict constraints on model complexity. Recently, Knowledge Distillation, which distills the knowledge from a well-trained and highly complex teacher model into a compact student model, has been widely used for model compression. However, under strict resource-cost requirements, it is quite challenging to make the student model achieve performance comparable to the teacher, essentially due to the drastically reduced expressive power of the compact student model. Inspired by the nature of expressive power in NN, we propose to use a multi-segment activation, which can significantly improve expressive power at very little cost, in the compact student model. Specifically, we propose a highly efficient multi-segment activation, called Light Multi-segment Activation (LMA), which can rapidly produce multiple linear regions with very few parameters by leveraging statistical information. Using LMA, the compact student model achieves much better performance, both effectively and efficiently, than a ReLU-equipped one of the same model complexity. Furthermore, the proposed method is compatible with other model compression techniques, such as quantization, so they can be used jointly for better compression performance. Experiments with state-of-the-art NN architectures on real-world tasks demonstrate the effectiveness and extensibility of LMA.
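To make the idea concrete, below is a minimal, hypothetical sketch of a multi-segment (piecewise-linear) activation in NumPy. It is not the paper's exact formulation: the class name, the quantile-based breakpoint placement, and the per-segment slope/intercept parameterization are illustrative assumptions, intended only to show how a few parameters can yield multiple linear regions guided by statistical information about the inputs.

```python
import numpy as np


class MultiSegmentActivation:
    """Illustrative sketch of a multi-segment activation (not the paper's exact LMA).

    Breakpoints are placed at quantiles of observed pre-activations (a form of
    the "statistical information" the abstract mentions), and each segment has
    its own slope and intercept, so the parameter count stays tiny: 2 * K
    values for K segments.
    """

    def __init__(self, num_segments=4):
        self.num_segments = num_segments
        # One (slope, intercept) pair per segment; initialized to the identity.
        self.slopes = np.ones(num_segments)
        self.intercepts = np.zeros(num_segments)
        self.breakpoints = None

    def fit_breakpoints(self, x):
        # Place K-1 interior breakpoints at quantiles of the input
        # distribution, so segments cover regions of roughly equal data mass.
        qs = np.linspace(0.0, 1.0, self.num_segments + 1)[1:-1]
        self.breakpoints = np.quantile(x, qs)

    def __call__(self, x):
        # Find each input's segment index, then apply that segment's
        # linear function elementwise.
        idx = np.searchsorted(self.breakpoints, x)
        return self.slopes[idx] * x + self.intercepts[idx]
```

In a distillation setting, the slopes and intercepts would be learned jointly with the student's weights; with identity initialization the activation starts as a no-op and the segments diverge during training.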

Published

2020-04-03

How to Cite

Xu, Z., Ke, G., Zhang, J., Bian, J., & Liu, T.-Y. (2020). Light Multi-Segment Activation for Model Compression. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 6542-6549. https://doi.org/10.1609/aaai.v34i04.6128

Section

AAAI Technical Track: Machine Learning