Proceedings:
Proceedings of the AAAI Conference on Artificial Intelligence, 35
Volume/Issue:
No. 15: AAAI-21 Technical Tracks 15
Track:
AAAI Technical Track on Speech and Natural Language Processing II
Abstract:
Lexical relations describe how concepts are semantically related, in the form of relation triples. Accurately predicting lexical relations between concepts is challenging due to the sparsity of patterns indicating the existence of such relations. We propose the Knowledge-Enriched Meta-Learning (KEML) framework to address lexical relation classification (LRC). In KEML, the LKB-BERT (Lexical Knowledge Base-BERT) model is first presented to learn concept representations from text corpora, with rich lexical knowledge injected by distant supervision. A probabilistic distribution over auxiliary tasks is defined to increase the model's ability to recognize different types of lexical relations. We further propose a neural classifier integrated with special relation recognition cells, in order to combine meta-learning over the auxiliary task distribution with supervised learning for LRC. Experiments over multiple datasets show that KEML outperforms state-of-the-art methods.
DOI:
10.1609/aaai.v35i15.17640
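
The abstract describes combining meta-learning over a distribution of auxiliary tasks with supervised training for LRC. The sketch below is only an illustration of that general pattern, not the authors' KEML implementation: it uses a Reptile-style meta-update over hypothetical binary "does this pair hold relation r?" auxiliary tasks, random tensors in place of LKB-BERT concept representations, and assumed names and hyperparameters throughout (PairClassifier, DIM, N_RELATIONS, learning rates). The paper's relation recognition cells are not reproduced here.

```python
# Minimal, hedged sketch of meta-learning over auxiliary relation-recognition
# tasks followed by supervised lexical relation classification (LRC).
# All names, dimensions, and hyperparameters are assumptions for illustration.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

DIM, N_RELATIONS = 768, 10          # assumed embedding size / label count

class PairClassifier(nn.Module):
    """Scores a concept pair from its concatenated pair representation."""
    def __init__(self, dim=DIM, n_classes=N_RELATIONS):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(2 * dim, 256), nn.ReLU())
        self.head = nn.Linear(256, n_classes)

    def forward(self, pair_repr):
        return self.head(self.body(pair_repr))

def sample_auxiliary_task(pairs, labels, relation_id):
    """Binary auxiliary task: does a pair hold the sampled relation?"""
    return pairs, (labels == relation_id).long()

def reptile_meta_step(model, pairs, labels, inner_steps=5,
                      inner_lr=1e-2, meta_lr=1e-1):
    """One meta-update: adapt a clone on an auxiliary task, then move the
    shared body toward the adapted weights (first-order, Reptile-style)."""
    relation_id = int(torch.randint(0, N_RELATIONS, (1,)))
    x, y = sample_auxiliary_task(pairs, labels, relation_id)

    adapted = copy.deepcopy(model)
    aux_head = nn.Linear(256, 2)                 # task-specific binary head
    opt = torch.optim.SGD(list(adapted.body.parameters())
                          + list(aux_head.parameters()), lr=inner_lr)
    for _ in range(inner_steps):
        loss = F.cross_entropy(aux_head(adapted.body(x)), y)
        opt.zero_grad(); loss.backward(); opt.step()

    # Interpolate shared-body weights toward the task-adapted weights.
    with torch.no_grad():
        for p, p_adapted in zip(model.body.parameters(),
                                adapted.body.parameters()):
            p += meta_lr * (p_adapted - p)

# Toy usage: random "pair representations" stand in for LKB-BERT output.
pairs = torch.randn(64, 2 * DIM)
labels = torch.randint(0, N_RELATIONS, (64,))
model = PairClassifier()
for _ in range(20):                              # meta-learning phase
    reptile_meta_step(model, pairs, labels)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(20):                              # supervised LRC phase
    loss = F.cross_entropy(model(pairs), labels)
    opt.zero_grad(); loss.backward(); opt.step()
```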