Improving Entity Linking by Modeling Latent Entity Type Information

Authors

  • Shuang Chen, Harbin Institute of Technology
  • Jinpeng Wang, Microsoft Research Asia
  • Feng Jiang, Harbin Institute of Technology
  • Chin-Yew Lin, Microsoft Research Asia

DOI:

https://doi.org/10.1609/aaai.v34i05.6251

Abstract

Existing state-of-the-art neural entity linking models employ an attention-based bag-of-words context model and pre-trained entity embeddings bootstrapped from word embeddings to assess topic-level context compatibility. However, the latent entity type information in the immediate context of the mention is neglected, which often causes the models to link mentions to incorrect entities of the wrong type. To tackle this problem, we propose to inject latent entity type information into the entity embeddings based on pre-trained BERT. In addition, we integrate a BERT-based entity similarity score into the local context model of a state-of-the-art model to better capture latent entity type information. Our model significantly outperforms state-of-the-art entity linking models on the standard benchmark (AIDA-CoNLL). Detailed experimental analysis demonstrates that our model corrects most of the type errors produced by the direct baseline.
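To make the idea of a BERT-based entity similarity score concrete, below is a minimal sketch of one plausible instantiation: candidate entities are embedded by encoding short entity descriptions with BERT, and each candidate is scored against the mention's immediate context by cosine similarity, so that type cues in the context (e.g., sports vs. academia) favor the candidate of the matching type. The use of [CLS] pooling, the example candidate names, and the description texts are illustrative assumptions, not the paper's actual architecture or training setup.

```python
import torch
from transformers import BertModel, BertTokenizer

# Illustrative sketch only: the paper's actual pooling, candidate
# generation, and training objective are not specified in this abstract.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def cls_embedding(text: str) -> torch.Tensor:
    """Encode text with BERT and return the [CLS] vector as a summary."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=64)
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state[:, 0, :].squeeze(0)  # shape: [hidden_size]

# Hypothetical candidates sharing a surface form but differing in type;
# entity descriptions serve as the (assumed) source of type information.
candidates = {
    "Michael_Jordan_(basketball)": "Michael Jordan is an American former professional basketball player.",
    "Michael_I._Jordan_(scientist)": "Michael I. Jordan is an American researcher in machine learning.",
}

# The mention's immediate context carries latent type cues.
context = "Jordan scored 32 points as the Bulls beat the Knicks."
ctx_vec = cls_embedding(context)

# BERT-based entity similarity score: cosine similarity between the
# context representation and each candidate's entity embedding.
for name, desc in candidates.items():
    score = torch.cosine_similarity(ctx_vec, cls_embedding(desc), dim=0).item()
    print(f"{name}: {score:.3f}")
```

Under this sketch, the basketball sense should score higher for the sports context, which is the kind of type-error correction the abstract describes.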

Published

2020-04-03

How to Cite

Chen, S., Wang, J., Jiang, F., & Lin, C.-Y. (2020). Improving Entity Linking by Modeling Latent Entity Type Information. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 7529-7537. https://doi.org/10.1609/aaai.v34i05.6251

Section

AAAI Technical Track: Natural Language Processing