LATTE: Latent Type Modeling for Biomedical Entity Linking

Authors

  • Ming Zhu, Virginia Tech
  • Busra Celikkaya, AWS AI
  • Parminder Bhatia, AWS AI
  • Chandan K. Reddy, Virginia Tech

DOI:

https://doi.org/10.1609/aaai.v34i05.6526

Abstract

Entity linking is the task of linking mentions of named entities in natural language text to entities in a curated knowledge base. This is of significant importance in the biomedical domain, where it can be used to semantically annotate large volumes of clinical records and biomedical literature with standardized concepts described in an ontology such as the Unified Medical Language System (UMLS). We observe that with precise type information, entity disambiguation becomes a straightforward task. However, fine-grained type information is usually not available in the biomedical domain. Thus, we propose LATTE, a LATent Type Entity linking model, which improves entity linking by modeling the latent fine-grained type information of mentions and entities. Unlike previous methods that perform entity linking directly between mentions and entities, LATTE jointly performs entity disambiguation and latent fine-grained type learning, without direct supervision. We evaluate our model on two biomedical datasets: MedMentions, a large-scale public dataset annotated with UMLS concepts, and a de-identified corpus of dictated doctor's notes annotated with ICD concepts. Extensive experimental evaluation shows that our model achieves significant performance improvements over several state-of-the-art techniques.
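To make the core idea concrete, below is a minimal sketch of joint entity disambiguation and latent type learning in PyTorch. All module names, dimensions, and the exact scoring function here are illustrative assumptions, not the paper's published architecture: the point is only that mention and entity representations each induce a soft distribution over latent fine-grained types, and the linking score rewards both semantic similarity and type agreement, so the types are learned indirectly from the linking objective rather than from type labels.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentTypeLinker(nn.Module):
    """Sketch of LATTE-style joint linking and latent typing (illustrative)."""

    def __init__(self, embed_dim=256, num_latent_types=64):
        super().__init__()
        # Projects an encoded mention/entity vector into logits over
        # K latent fine-grained types (no type labels are ever observed).
        self.type_scorer = nn.Linear(embed_dim, num_latent_types)

    def forward(self, mention_vec, entity_vec):
        # Soft latent-type distributions for the mention and the candidate.
        mention_types = F.softmax(self.type_scorer(mention_vec), dim=-1)
        entity_types = F.softmax(self.type_scorer(entity_vec), dim=-1)
        # Semantic similarity between mention and entity representations.
        sim = F.cosine_similarity(mention_vec, entity_vec, dim=-1)
        # Type agreement: high when the two latent-type distributions match.
        type_agreement = (mention_types * entity_types).sum(dim=-1)
        # The ranking loss over candidates supplies the only (indirect)
        # supervision signal for the latent type layer.
        return sim + type_agreement

# Usage: score one mention against its candidate entities, pick the best.
model = LatentTypeLinker()
mention = torch.randn(1, 256)       # placeholder encoder output
candidates = torch.randn(5, 256)    # placeholder candidate encodings
scores = model(mention.expand(5, -1), candidates)
best_candidate = scores.argmax().item()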

Published

2020-04-03

How to Cite

Zhu, M., Celikkaya, B., Bhatia, P., & Reddy, C. K. (2020). LATTE: Latent Type Modeling for Biomedical Entity Linking. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 9757-9764. https://doi.org/10.1609/aaai.v34i05.6526

Section

AAAI Technical Track: Natural Language Processing