Proceedings: Proceedings of the AAAI Conference on Artificial Intelligence, 36
Issue: No. 3: AAAI-22 Technical Tracks 3
Track: AAAI Technical Track on Computer Vision III
Abstract:
Batch Normalization (BN), an important component of deep neural networks, helps them achieve promising performance on a wide range of learning tasks by scaling the distribution of feature representations within mini-batches. However, BN suffers from performance degradation under Unsupervised Domain Adaptation (UDA), since the estimated statistics fail to concurrently describe two different domains. In this paper, we develop a novel normalization technique, named Collaborative Normalization (CoN), for eliminating domain discrepancy and accelerating the training of neural networks for UDA. Unlike typical strategies that exploit only domain-specific statistics during normalization, our CoN excavates cross-domain knowledge and simultaneously scales features from both domains by mimicking the merits of collaborative representation. CoN can be easily plugged into popular neural network backbones for cross-domain learning. On the one hand, theoretical analysis guarantees that models with CoN promote the discriminability of feature representations and accelerate the convergence rate; on the other hand, empirical study verifies that replacing BN with CoN in popular network backbones effectively improves classification accuracy in most learning tasks across three cross-domain visual benchmarks.
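To make the idea of a cross-domain drop-in replacement for BN concrete, the following is a minimal sketch in Python/PyTorch of a normalization layer that keeps per-domain running statistics and normalizes each domain's features with a blend of its own and the other domain's statistics. The module name CrossDomainNorm2d, the convex mixing rule, and the alpha parameter are illustrative assumptions for this sketch, not the exact CoN formulation from the paper.

import torch
import torch.nn as nn

class CrossDomainNorm2d(nn.Module):
    """Illustrative BN-style layer for UDA (assumed design, not the paper's CoN).

    Tracks separate running statistics for the source (domain 0) and target
    (domain 1) domains and normalizes each batch with a convex blend of its
    own batch statistics and the other domain's running statistics, so that
    features from both domains are scaled into a shared range.
    """

    def __init__(self, num_features, alpha=0.5, eps=1e-5, momentum=0.1):
        super().__init__()
        self.alpha = alpha          # blend weight: 1.0 -> purely domain-specific
        self.eps = eps
        self.momentum = momentum
        # Shared affine parameters, as in standard BN.
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))
        # Separate running statistics per domain: row 0 = source, row 1 = target.
        self.register_buffer("running_mean", torch.zeros(2, num_features))
        self.register_buffer("running_var", torch.ones(2, num_features))

    def forward(self, x, domain: int):
        other = 1 - domain
        if self.training:
            # Batch statistics of the current domain's mini-batch.
            mean = x.mean(dim=(0, 2, 3))
            var = x.var(dim=(0, 2, 3), unbiased=False)
            with torch.no_grad():
                self.running_mean[domain].mul_(1 - self.momentum).add_(self.momentum * mean)
                self.running_var[domain].mul_(1 - self.momentum).add_(self.momentum * var)
        else:
            mean = self.running_mean[domain]
            var = self.running_var[domain]
        # Blend the current domain's statistics with the other domain's
        # running statistics (the mixing rule here is an assumption).
        mix_mean = self.alpha * mean + (1 - self.alpha) * self.running_mean[other]
        mix_var = self.alpha * var + (1 - self.alpha) * self.running_var[other]
        x_hat = (x - mix_mean[None, :, None, None]) / torch.sqrt(
            mix_var[None, :, None, None] + self.eps)
        return x_hat * self.weight[None, :, None, None] + self.bias[None, :, None, None]

Used this way, the layer would replace nn.BatchNorm2d inside a backbone, with the caller passing domain=0 for source batches and domain=1 for target batches; how the paper's CoN actually combines the two domains' statistics should be taken from the paper itself rather than from this sketch.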
DOI: 10.1609/aaai.v36i3.20181