Abstract:
This paper reports early results at the intersection of knowledge and language acquisition. Humans learn much by reading, a capability largely absent from machines. We assume that (1) some conceptual structure exists, represented in an ontology, (2) a handful of examples of each concept and relation are provided, and (3) the machine knows the grammatical and semantic structure of the language. The task is to learn the many ways the concepts and relations are expressed, so that the machine can automatically map from source text to the knowledge base. As a case study we examined the relations invent(inventor, invention), employ(employer, employee), and located-at(entity, location). Our results show that structural features, e.g., dependency parses and propositions (predicate-argument structure), outperform non-structural features, e.g., flat strings of words.
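To make the contrast concrete, the following is a minimal sketch, not the authors' code: the toy hand-built parse and names such as dep_path_feature are illustrative. It contrasts a structural feature read off the dependency path between a relation's two arguments with a non-structural feature built from the raw word string between them.

```python
from collections import namedtuple

Token = namedtuple("Token", ["idx", "text", "head", "dep"])

# Hand-built dependency parse of "Edison invented the phonograph."
# `head` is the index of the governing token (-1 marks the root).
sentence = [
    Token(0, "Edison",     1, "nsubj"),
    Token(1, "invented",  -1, "ROOT"),
    Token(2, "the",        3, "det"),
    Token(3, "phonograph", 1, "dobj"),
]

def path_to_root(tokens, idx):
    """Return the chain of token indices from idx up to the root."""
    path = [idx]
    while tokens[idx].head != -1:
        idx = tokens[idx].head
        path.append(idx)
    return path

def dep_path_feature(tokens, arg1, arg2):
    """Structural feature: dependency relations on the path arg1 -> arg2."""
    up = path_to_root(tokens, arg1)
    down = path_to_root(tokens, arg2)
    # Lowest common ancestor = first index shared by the two root paths.
    lca = next(i for i in up if i in down)
    rels_up = [tokens[i].dep for i in up[: up.index(lca)]]
    rels_down = [tokens[i].dep for i in down[: down.index(lca)]]
    return " ".join(rels_up + [tokens[lca].text] + list(reversed(rels_down)))

def word_string_feature(tokens, arg1, arg2):
    """Non-structural feature: the raw words between the two arguments."""
    lo, hi = sorted((arg1, arg2))
    return " ".join(t.text for t in tokens[lo + 1 : hi])

# invent(inventor=Edison, invention=phonograph)
print(dep_path_feature(sentence, 0, 3))    # -> "nsubj invented dobj"
print(word_string_feature(sentence, 0, 3)) # -> "invented the"
```

The dependency-path feature ("nsubj invented dobj") abstracts away determiners and word order, so it generalizes across surface variations that would fragment a word-string feature; this is one plausible reading of why the structural features win in the reported experiments.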