Learning Multi-Level Dependencies for Robust Word Recognition

Authors

  • Zhiwei Wang, Michigan State University
  • Hui Liu, Michigan State University
  • Jiliang Tang, Michigan State University
  • Songfan Yang, TAL Education Group
  • Gale Yan Huang, TAL Education Group
  • Zitao Liu, TAL Education Group

DOI:

https://doi.org/10.1609/aaai.v34i05.6463

Abstract

Robust language processing systems are becoming increasingly important given the recent awareness of dangerous situations where brittle machine learning models can be easily broken in the presence of noise. In this paper, we introduce a robust word recognition framework that captures multi-level sequential dependencies in noised sentences. The proposed framework employs a sequence-to-sequence model over the characters of each word, whose output is fed into a word-level bi-directional recurrent neural network. We conduct extensive experiments to verify the effectiveness of the framework. The results show that the proposed framework outperforms state-of-the-art methods by a large margin, and they also suggest that character-level dependencies can play an important role in word recognition. The code of the proposed framework and the major experiments are publicly available.
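The abstract describes a two-level architecture: a character-level model produces a representation for each word, and a word-level bi-directional RNN then contextualizes those representations across the sentence. The following is a minimal, illustrative sketch of that idea, not the authors' implementation: it simplifies the character-level sequence-to-sequence component to a plain character-level encoder, uses randomly initialized weights in place of trained ones, and invents all dimension sizes and variable names.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative only; the paper's actual sizes differ).
CHAR_VOCAB, WORD_VOCAB = 30, 50
CHAR_DIM, WORD_DIM = 8, 16

# Randomly initialized parameters stand in for trained weights.
char_emb = rng.normal(size=(CHAR_VOCAB, CHAR_DIM))
W_char = rng.normal(size=(CHAR_DIM + WORD_DIM, WORD_DIM)) * 0.1
W_fwd = rng.normal(size=(WORD_DIM + WORD_DIM, WORD_DIM)) * 0.1
W_bwd = rng.normal(size=(WORD_DIM + WORD_DIM, WORD_DIM)) * 0.1
W_out = rng.normal(size=(2 * WORD_DIM, WORD_VOCAB)) * 0.1

def rnn(inputs, W):
    """Simple tanh RNN; returns the sequence of hidden states."""
    h = np.zeros(W.shape[1])
    states = []
    for x in inputs:
        h = np.tanh(np.concatenate([x, h]) @ W)
        states.append(h)
    return states

def encode_word(char_ids):
    """Character-level encoder: the final hidden state summarizes the word,
    capturing intra-word (character-level) dependencies."""
    chars = [char_emb[c] for c in char_ids]
    return rnn(chars, W_char)[-1]

def recognize(sentence_char_ids):
    """Word-level BiRNN over character-derived word vectors: forward and
    backward passes capture inter-word (sentence-level) dependencies."""
    words = [encode_word(w) for w in sentence_char_ids]
    fwd = rnn(words, W_fwd)
    bwd = rnn(words[::-1], W_bwd)[::-1]
    # Concatenate both directions and score each position against the
    # word vocabulary; the argmax is the recognized (corrected) word.
    return [int(np.argmax(np.concatenate([f, b]) @ W_out))
            for f, b in zip(fwd, bwd)]

# Three "words" given as character-id sequences (e.g. a noised sentence).
sentence = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]
preds = recognize(sentence)
```

Because each word vector is built from its characters before the sentence-level pass, noise inside a word (swapped or dropped characters) perturbs the input representation rather than mapping to an unknown-word token, which is the intuition behind the robustness claim.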

Published

2020-04-03

How to Cite

Wang, Z., Liu, H., Tang, J., Yang, S., Huang, G. Y., & Liu, Z. (2020). Learning Multi-Level Dependencies for Robust Word Recognition. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 9250-9257. https://doi.org/10.1609/aaai.v34i05.6463

Section

AAAI Technical Track: Natural Language Processing