Proceedings: Proceedings of the AAAI Conference on Artificial Intelligence, 19, Book One
Track: Learning

Abstract:
Context-free grammars cannot be identified in the limit from positive examples alone, yet natural language grammars are more powerful than context-free grammars, and humans learn them with remarkable ease from positive examples. Identifiability results for formal languages ignore a potentially powerful source of information available to learners of natural languages, namely, meanings. This paper explores the learnability of syntax (i.e., context-free grammars) given positive examples and knowledge of lexical semantics, and the learnability of lexical semantics given knowledge of syntax. The long-term goal is to develop an approach to learning both syntax and semantics that bootstraps itself, using limited knowledge about syntax to infer additional knowledge about semantics, and limited knowledge about semantics to infer additional knowledge about syntax.