Generalized Batch Normalization: Towards Accelerating Deep Neural Networks

Authors

  • Xiaoyong Yuan, University of Florida
  • Zheng Feng, University of Florida
  • Matthew Norton, Naval Postgraduate School
  • Xiaolin Li, University of Florida

DOI:

https://doi.org/10.1609/aaai.v33i01.33011682

Abstract

Utilizing recently introduced concepts from statistics and quantitative risk management, we present a general variant of Batch Normalization (BN) that offers accelerated convergence of neural network training compared to conventional BN. In general, we show that the mean and standard deviation are not always the most appropriate choices for the centering and scaling steps of the BN transformation, particularly when ReLU follows the normalization step. We present a Generalized Batch Normalization (GBN) transformation that can utilize a variety of alternative deviation measures for scaling and alternative statistics for centering, choices which arise naturally from the theory of generalized deviation measures and from risk theory more broadly. When used in conjunction with the ReLU non-linearity, the underlying risk theory suggests natural, arguably optimal choices for the deviation measure and centering statistic. Using the suggested deviation measure and statistic, we show experimentally that training is accelerated beyond what conventional BN achieves, often with an improved error rate as well. Overall, we propose a more flexible BN transformation supported by a complementary theoretical framework that can potentially guide design choices.
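
To make the idea of the GBN transformation concrete, the sketch below shows a generalized batch-normalization forward pass in which the centering statistic and the deviation measure are pluggable. The specific alternative choices shown (median centering with mean-absolute-deviation scaling) are assumptions chosen for illustration only; the abstract does not state which deviation measure and statistic the paper ultimately recommends, and the function names are hypothetical.

```python
# Minimal sketch of a "generalized" batch-normalization forward pass.
# Conventional BN uses the mean for centering and the standard deviation for
# scaling; GBN allows other centering statistics and deviation measures.
import numpy as np

def generalized_batch_norm(x, gamma, beta, center_fn, deviation_fn, eps=1e-5):
    """Normalize a batch x of shape (batch, features) per feature.

    center_fn:    batch -> per-feature centering statistic, shape (features,)
    deviation_fn: centered batch -> per-feature deviation measure, shape (features,)
    gamma, beta:  learned scale and shift, shape (features,)
    """
    center = center_fn(x)                      # e.g. mean or median
    deviation = deviation_fn(x - center)       # e.g. std or mean absolute deviation
    x_hat = (x - center) / (deviation + eps)   # center and scale
    return gamma * x_hat + beta                # learned affine transform, as in BN

# Conventional BN corresponds to these choices:
bn_center = lambda x: x.mean(axis=0)
bn_deviation = lambda d: d.std(axis=0)

# One alternative pairing (an illustrative assumption, not necessarily the
# paper's recommendation): median centering with mean-absolute-deviation scaling.
med_center = lambda x: np.median(x, axis=0)
mad_deviation = lambda d: np.abs(d).mean(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(32, 4))
    gamma, beta = np.ones(4), np.zeros(4)
    y_bn = generalized_batch_norm(x, gamma, beta, bn_center, bn_deviation)
    y_gbn = generalized_batch_norm(x, gamma, beta, med_center, mad_deviation)
    print(y_bn.mean(axis=0), y_gbn.mean(axis=0))
```

The only structural change relative to conventional BN is that the centering statistic and deviation measure are parameters of the transform rather than fixed to the mean and standard deviation; the learned affine scale and shift are retained unchanged.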

Published

2019-07-17

How to Cite

Yuan, X., Feng, Z., Norton, M., & Li, X. (2019). Generalized Batch Normalization: Towards Accelerating Deep Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 1682-1689. https://doi.org/10.1609/aaai.v33i01.33011682

Issue

Vol. 33 No. 01 (2019)

Section

AAAI Technical Track: Constraint Satisfaction and Optimization