Analyzing Compositionality-Sensitivity of NLI Models

Authors

  • Yixin Nie, University of North Carolina at Chapel Hill
  • Yicheng Wang, University of North Carolina at Chapel Hill
  • Mohit Bansal, University of North Carolina at Chapel Hill

DOI:

https://doi.org/10.1609/aaai.v33i01.33016867

Abstract

Success in natural language inference (NLI) should require a model to understand both lexical and compositional semantics. However, through adversarial evaluation, we find that several state-of-the-art models with diverse architectures over-rely on the former and fail to use the latter. Further, this compositionality unawareness is not reflected in standard evaluation on current datasets: removing RNNs from existing models or shuffling input words during training does not induce a large performance loss, despite the explicit removal of compositional information. Therefore, we propose a compositionality-sensitivity testing setup that analyzes models on natural examples from existing datasets that cannot be solved via lexical features alone (i.e., on which a bag-of-words model assigns a high probability to a wrong label), hence revealing the models’ actual compositionality awareness. We show that this setup not only highlights the limited compositional ability of current NLI models, but also differentiates model performance based on design, e.g., separating shallow bag-of-words models from deeper, linguistically-grounded tree-based models. Our evaluation setup is an important analysis tool: it complements existing adversarial and linguistically-driven diagnostic evaluations, and exposes opportunities for future work on evaluating models’ compositional understanding.
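The selection criterion described in the abstract, keeping only examples that a purely lexical model gets confidently wrong, can be illustrated with a minimal sketch. The scikit-learn pipeline, function names, and the 0.7 probability threshold below are illustrative assumptions, not the authors' released implementation.

```python
# Sketch of the compositionality-sensitivity filtering idea: retain NLI examples
# on which a bag-of-words (lexical-only) classifier assigns high probability to
# an incorrect label. All names and the threshold are hypothetical choices.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline


def train_bow_model(premises, hypotheses, labels):
    """Train a lexical-only NLI classifier on concatenated premise/hypothesis text."""
    texts = [p + " " + h for p, h in zip(premises, hypotheses)]
    model = make_pipeline(
        CountVectorizer(ngram_range=(1, 1)),   # unigram bag-of-words features
        LogisticRegression(max_iter=1000),
    )
    model.fit(texts, labels)
    return model


def compositionality_sensitive_subset(model, examples, threshold=0.7):
    """Select examples whose gold label cannot be recovered from lexical features:
    the bag-of-words model puts >= threshold probability on some wrong label."""
    selected = []
    for premise, hypothesis, gold in examples:
        probs = model.predict_proba([premise + " " + hypothesis])[0]
        if any(label != gold and p >= threshold
               for label, p in zip(model.classes_, probs)):
            selected.append((premise, hypothesis, gold))
    return selected
```

Evaluating an NLI model only on such a subset then isolates its compositional ability, since lexical cues alone point toward the wrong answer on every retained example.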

Published

2019-07-17

How to Cite

Nie, Y., Wang, Y., & Bansal, M. (2019). Analyzing Compositionality-Sensitivity of NLI Models. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 6867-6874. https://doi.org/10.1609/aaai.v33i01.33016867

Section

AAAI Technical Track: Natural Language Processing