Distributed Representations for Arithmetic Word Problems

Authors

  • Sowmya S Sundaram, Indian Institute of Technology-Madras
  • Deepak P, Queen's University, Belfast
  • Savitha Sam Abraham, Indian Institute of Technology-Madras

DOI:

https://doi.org/10.1609/aaai.v34i05.6432

Abstract

We consider the task of learning distributed representations for arithmetic word problems. We outline the characteristics of the domain of arithmetic word problems that make generic text embedding methods inadequate, necessitating a specialized representation learning method to facilitate the task of retrieval across a wide range of use cases within online learning platforms. Our contribution is twofold: first, we propose several 'operators' that distil knowledge of the domain of arithmetic word problems and schemas into word problem transformations. Second, we propose a novel neural architecture that combines LSTMs with graph convolutional networks to leverage word problems and their operator-transformed versions to learn distributed representations for word problems. While our target is to ensure that the distributed representations are schema-aligned, we do not make use of schema labels in the learning process, thus yielding an unsupervised representation learning method. Through an evaluation on retrieval over a publicly available corpus of word problems, we illustrate that our framework is able to consistently improve upon contemporary generic text embeddings in terms of schema-alignment.
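
The abstract describes a sequence encoder (LSTM) whose outputs are propagated over a graph linking each word problem to its operator-transformed variants. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch, not the authors' implementation; the module name, dimensions, and single GCN-style propagation step are assumptions made for brevity.

```python
# Minimal sketch (assumed, not the paper's code): LSTM sentence encoding followed by
# one graph-convolution step over a graph whose edges connect a word problem to its
# operator-transformed versions. All names and sizes here are illustrative.
import torch
import torch.nn as nn

class ProblemEncoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.gcn_weight = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, token_ids, adjacency):
        # token_ids: (num_problems, seq_len); adjacency: (num_problems, num_problems)
        # 1) Sequence encoding: take the final LSTM hidden state per problem.
        _, (h_n, _) = self.lstm(self.embed(token_ids))
        h = h_n.squeeze(0)                      # (num_problems, hidden_dim)
        # 2) One GCN-style propagation step over the problem/transformation graph:
        #    h' = ReLU(D^{-1} A h W), mixing each problem with its variants.
        deg = adjacency.sum(dim=1, keepdim=True).clamp(min=1.0)
        return torch.relu(self.gcn_weight((adjacency / deg) @ h))

# Toy usage: 4 padded problems, where adjacency[i, j] = 1 when problem j is an
# operator-transformed version of problem i (self-loops included).
encoder = ProblemEncoder(vocab_size=1000)
tokens = torch.randint(0, 1000, (4, 20))
adj = torch.eye(4) + torch.tensor([[0, 1, 0, 0],
                                   [1, 0, 0, 0],
                                   [0, 0, 0, 1],
                                   [0, 0, 1, 0.]])
reps = encoder(tokens, adj)                     # (4, 128) distributed representations
```

In this reading, the graph encodes only which problems are transformed versions of one another, so the propagation step pulls a problem's representation toward its schema-preserving variants without ever consulting schema labels, which is consistent with the unsupervised setup the abstract describes.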

Published

2020-04-03

How to Cite

Sundaram, S. S., P, D., & Abraham, S. S. (2020). Distributed Representations for Arithmetic Word Problems. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 9000-9007. https://doi.org/10.1609/aaai.v34i05.6432

Section

AAAI Technical Track: Natural Language Processing